Feb 26 19:50:15 localhost kernel: Linux version 5.14.0-686.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Feb 19 10:49:27 UTC 2026
Feb 26 19:50:15 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 26 19:50:15 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64 root=UUID=37391a25-080d-4723-8b0c-cb88a559875b ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 26 19:50:15 localhost kernel: BIOS-provided physical RAM map:
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 26 19:50:15 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 26 19:50:15 localhost kernel: NX (Execute Disable) protection: active
Feb 26 19:50:15 localhost kernel: APIC: Static calls initialized
Feb 26 19:50:15 localhost kernel: SMBIOS 2.8 present.
Feb 26 19:50:15 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 26 19:50:15 localhost kernel: Hypervisor detected: KVM
Feb 26 19:50:15 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 26 19:50:15 localhost kernel: kvm-clock: using sched offset of 8911049290 cycles
Feb 26 19:50:15 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 26 19:50:15 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 26 19:50:15 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 26 19:50:15 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 26 19:50:15 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 26 19:50:15 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 26 19:50:15 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 26 19:50:15 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 26 19:50:15 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 26 19:50:15 localhost kernel: Using GB pages for direct mapping
Feb 26 19:50:15 localhost kernel: RAMDISK: [mem 0x1b6ca000-0x29b5cfff]
Feb 26 19:50:15 localhost kernel: ACPI: Early table checksum verification disabled
Feb 26 19:50:15 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 26 19:50:15 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 26 19:50:15 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 26 19:50:15 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 26 19:50:15 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 26 19:50:15 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 26 19:50:15 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 26 19:50:15 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 26 19:50:15 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 26 19:50:15 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 26 19:50:15 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 26 19:50:15 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 26 19:50:15 localhost kernel: No NUMA configuration found
Feb 26 19:50:15 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 26 19:50:15 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Feb 26 19:50:15 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 26 19:50:15 localhost kernel: Zone ranges:
Feb 26 19:50:15 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 26 19:50:15 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 26 19:50:15 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 26 19:50:15 localhost kernel:   Device   empty
Feb 26 19:50:15 localhost kernel: Movable zone start for each node
Feb 26 19:50:15 localhost kernel: Early memory node ranges
Feb 26 19:50:15 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 26 19:50:15 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 26 19:50:15 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 26 19:50:15 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 26 19:50:15 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 26 19:50:15 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 26 19:50:15 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 26 19:50:15 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 26 19:50:15 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 26 19:50:15 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 26 19:50:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 26 19:50:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 26 19:50:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 26 19:50:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 26 19:50:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 26 19:50:15 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 26 19:50:15 localhost kernel: TSC deadline timer available
Feb 26 19:50:15 localhost kernel: CPU topo: Max. logical packages:   8
Feb 26 19:50:15 localhost kernel: CPU topo: Max. logical dies:       8
Feb 26 19:50:15 localhost kernel: CPU topo: Max. dies per package:   1
Feb 26 19:50:15 localhost kernel: CPU topo: Max. threads per core:   1
Feb 26 19:50:15 localhost kernel: CPU topo: Num. cores per package:     1
Feb 26 19:50:15 localhost kernel: CPU topo: Num. threads per package:   1
Feb 26 19:50:15 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 26 19:50:15 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 26 19:50:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 26 19:50:15 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 26 19:50:15 localhost kernel: Booting paravirtualized kernel on KVM
Feb 26 19:50:15 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 26 19:50:15 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 26 19:50:15 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 26 19:50:15 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 26 19:50:15 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 26 19:50:15 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 26 19:50:15 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64 root=UUID=37391a25-080d-4723-8b0c-cb88a559875b ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 26 19:50:15 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64", will be passed to user space.
Feb 26 19:50:15 localhost kernel: random: crng init done
Feb 26 19:50:15 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 26 19:50:15 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 26 19:50:15 localhost kernel: Fallback order for Node 0: 0 
Feb 26 19:50:15 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 26 19:50:15 localhost kernel: Policy zone: Normal
Feb 26 19:50:15 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 26 19:50:15 localhost kernel: software IO TLB: area num 8.
Feb 26 19:50:15 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 26 19:50:15 localhost kernel: ftrace: allocating 49605 entries in 194 pages
Feb 26 19:50:15 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 26 19:50:15 localhost kernel: Dynamic Preempt: voluntary
Feb 26 19:50:15 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 26 19:50:15 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 26 19:50:15 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 26 19:50:15 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 26 19:50:15 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 26 19:50:15 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 26 19:50:15 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 26 19:50:15 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 26 19:50:15 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 26 19:50:15 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 26 19:50:15 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 26 19:50:15 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 26 19:50:15 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 26 19:50:15 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 26 19:50:15 localhost kernel: Console: colour VGA+ 80x25
Feb 26 19:50:15 localhost kernel: printk: console [ttyS0] enabled
Feb 26 19:50:15 localhost kernel: ACPI: Core revision 20230331
Feb 26 19:50:15 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 26 19:50:15 localhost kernel: x2apic enabled
Feb 26 19:50:15 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 26 19:50:15 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 26 19:50:15 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 26 19:50:15 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 26 19:50:15 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 26 19:50:15 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 26 19:50:15 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 26 19:50:15 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 26 19:50:15 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 26 19:50:15 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 26 19:50:15 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 26 19:50:15 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 26 19:50:15 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 26 19:50:15 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 26 19:50:15 localhost kernel: active return thunk: retbleed_return_thunk
Feb 26 19:50:15 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 26 19:50:15 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 26 19:50:15 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 26 19:50:15 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 26 19:50:15 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 26 19:50:15 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 26 19:50:15 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 26 19:50:15 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 26 19:50:15 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 26 19:50:15 localhost kernel: landlock: Up and running.
Feb 26 19:50:15 localhost kernel: Yama: becoming mindful.
Feb 26 19:50:15 localhost kernel: SELinux:  Initializing.
Feb 26 19:50:15 localhost kernel: LSM support for eBPF active
Feb 26 19:50:15 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 26 19:50:15 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 26 19:50:15 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 26 19:50:15 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 26 19:50:15 localhost kernel: ... version:                0
Feb 26 19:50:15 localhost kernel: ... bit width:              48
Feb 26 19:50:15 localhost kernel: ... generic registers:      6
Feb 26 19:50:15 localhost kernel: ... value mask:             0000ffffffffffff
Feb 26 19:50:15 localhost kernel: ... max period:             00007fffffffffff
Feb 26 19:50:15 localhost kernel: ... fixed-purpose events:   0
Feb 26 19:50:15 localhost kernel: ... event mask:             000000000000003f
Feb 26 19:50:15 localhost kernel: signal: max sigframe size: 1776
Feb 26 19:50:15 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 26 19:50:15 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 26 19:50:15 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 26 19:50:15 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 26 19:50:15 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 26 19:50:15 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 26 19:50:15 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 26 19:50:15 localhost kernel: node 0 deferred pages initialised in 8ms
Feb 26 19:50:15 localhost kernel: Memory: 7617448K/8388068K available (16384K kernel code, 5797K rwdata, 13956K rodata, 4204K init, 7172K bss, 764464K reserved, 0K cma-reserved)
Feb 26 19:50:15 localhost kernel: devtmpfs: initialized
Feb 26 19:50:15 localhost kernel: x86/mm: Memory block size: 128MB
Feb 26 19:50:15 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 26 19:50:15 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 26 19:50:15 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 26 19:50:15 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 26 19:50:15 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 26 19:50:15 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 26 19:50:15 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 26 19:50:15 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 26 19:50:15 localhost kernel: audit: type=2000 audit(1772135414.307:1): state=initialized audit_enabled=0 res=1
Feb 26 19:50:15 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 26 19:50:15 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 26 19:50:15 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 26 19:50:15 localhost kernel: cpuidle: using governor menu
Feb 26 19:50:15 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 26 19:50:15 localhost kernel: PCI: Using configuration type 1 for base access
Feb 26 19:50:15 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 26 19:50:15 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 26 19:50:15 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 26 19:50:15 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 26 19:50:15 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 26 19:50:15 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 26 19:50:15 localhost kernel: Demotion targets for Node 0: null
Feb 26 19:50:15 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 26 19:50:15 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 26 19:50:15 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 26 19:50:15 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 26 19:50:15 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 26 19:50:15 localhost kernel: ACPI: Interpreter enabled
Feb 26 19:50:15 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 26 19:50:15 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 26 19:50:15 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 26 19:50:15 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 26 19:50:15 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 26 19:50:15 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 26 19:50:15 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [3] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [4] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [5] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [6] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [7] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [8] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [9] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [10] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [11] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [12] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [13] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [14] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [15] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [16] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [17] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [18] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [19] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [20] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [21] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [22] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [23] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [24] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [25] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [26] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [27] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [28] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [29] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [30] registered
Feb 26 19:50:15 localhost kernel: acpiphp: Slot [31] registered
Feb 26 19:50:15 localhost kernel: PCI host bridge to bus 0000:00
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 26 19:50:15 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 26 19:50:15 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 26 19:50:15 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 26 19:50:15 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 26 19:50:15 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 26 19:50:15 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 26 19:50:15 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 26 19:50:15 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 26 19:50:15 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 26 19:50:15 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 26 19:50:15 localhost kernel: iommu: Default domain type: Translated
Feb 26 19:50:15 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 26 19:50:15 localhost kernel: SCSI subsystem initialized
Feb 26 19:50:15 localhost kernel: ACPI: bus type USB registered
Feb 26 19:50:15 localhost kernel: usbcore: registered new interface driver usbfs
Feb 26 19:50:15 localhost kernel: usbcore: registered new interface driver hub
Feb 26 19:50:15 localhost kernel: usbcore: registered new device driver usb
Feb 26 19:50:15 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 26 19:50:15 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 26 19:50:15 localhost kernel: PTP clock support registered
Feb 26 19:50:15 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 26 19:50:15 localhost kernel: NetLabel: Initializing
Feb 26 19:50:15 localhost kernel: NetLabel:  domain hash size = 128
Feb 26 19:50:15 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 26 19:50:15 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 26 19:50:15 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 26 19:50:15 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 26 19:50:15 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 26 19:50:15 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 26 19:50:15 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 26 19:50:15 localhost kernel: vgaarb: loaded
Feb 26 19:50:15 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 26 19:50:15 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 26 19:50:15 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 26 19:50:15 localhost kernel: pnp: PnP ACPI init
Feb 26 19:50:15 localhost kernel: pnp 00:03: [dma 2]
Feb 26 19:50:15 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 26 19:50:15 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 26 19:50:15 localhost kernel: NET: Registered PF_INET protocol family
Feb 26 19:50:15 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 26 19:50:15 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 26 19:50:15 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 26 19:50:15 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 26 19:50:15 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 26 19:50:15 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 26 19:50:15 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 26 19:50:15 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 26 19:50:15 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 26 19:50:15 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 26 19:50:15 localhost kernel: NET: Registered PF_XDP protocol family
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 26 19:50:15 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 26 19:50:15 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 26 19:50:15 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 26 19:50:15 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 49859 usecs
Feb 26 19:50:15 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 26 19:50:15 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 26 19:50:15 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 26 19:50:15 localhost kernel: ACPI: bus type thunderbolt registered
Feb 26 19:50:15 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 26 19:50:15 localhost kernel: Initialise system trusted keyrings
Feb 26 19:50:15 localhost kernel: Key type blacklist registered
Feb 26 19:50:15 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 26 19:50:15 localhost kernel: zbud: loaded
Feb 26 19:50:15 localhost kernel: integrity: Platform Keyring initialized
Feb 26 19:50:15 localhost kernel: integrity: Machine keyring initialized
Feb 26 19:50:15 localhost kernel: Freeing initrd memory: 234060K
Feb 26 19:50:15 localhost kernel: NET: Registered PF_ALG protocol family
Feb 26 19:50:15 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 26 19:50:15 localhost kernel: Key type asymmetric registered
Feb 26 19:50:15 localhost kernel: Asymmetric key parser 'x509' registered
Feb 26 19:50:15 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 26 19:50:15 localhost kernel: io scheduler mq-deadline registered
Feb 26 19:50:15 localhost kernel: io scheduler kyber registered
Feb 26 19:50:15 localhost kernel: io scheduler bfq registered
Feb 26 19:50:15 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 26 19:50:15 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 26 19:50:15 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 26 19:50:15 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 26 19:50:15 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 26 19:50:15 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 26 19:50:15 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 26 19:50:15 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 26 19:50:15 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 26 19:50:15 localhost kernel: Non-volatile memory driver v1.3
Feb 26 19:50:15 localhost kernel: rdac: device handler registered
Feb 26 19:50:15 localhost kernel: hp_sw: device handler registered
Feb 26 19:50:15 localhost kernel: emc: device handler registered
Feb 26 19:50:15 localhost kernel: alua: device handler registered
Feb 26 19:50:15 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 26 19:50:15 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 26 19:50:15 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 26 19:50:15 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 26 19:50:15 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 26 19:50:15 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 26 19:50:15 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 26 19:50:15 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-686.el9.x86_64 uhci_hcd
Feb 26 19:50:15 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 26 19:50:15 localhost kernel: hub 1-0:1.0: USB hub found
Feb 26 19:50:15 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 26 19:50:15 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 26 19:50:15 localhost kernel: usbserial: USB Serial support registered for generic
Feb 26 19:50:15 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 26 19:50:15 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 26 19:50:15 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 26 19:50:15 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 26 19:50:15 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 26 19:50:15 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 26 19:50:15 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 26 19:50:15 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 26 19:50:15 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 26 19:50:15 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-26T19:50:14 UTC (1772135414)
Feb 26 19:50:15 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 26 19:50:15 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 26 19:50:15 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 26 19:50:15 localhost kernel: usbcore: registered new interface driver usbhid
Feb 26 19:50:15 localhost kernel: usbhid: USB HID core driver
Feb 26 19:50:15 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 26 19:50:15 localhost kernel: Initializing XFRM netlink socket
Feb 26 19:50:15 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 26 19:50:15 localhost kernel: Segment Routing with IPv6
Feb 26 19:50:15 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 26 19:50:15 localhost kernel: mpls_gso: MPLS GSO support
Feb 26 19:50:15 localhost kernel: IPI shorthand broadcast: enabled
Feb 26 19:50:15 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 26 19:50:15 localhost kernel: AES CTR mode by8 optimization enabled
Feb 26 19:50:15 localhost kernel: sched_clock: Marking stable (1172008151, 148485733)->(1445945246, -125451362)
Feb 26 19:50:15 localhost kernel: registered taskstats version 1
Feb 26 19:50:15 localhost kernel: Loading compiled-in X.509 certificates
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: d9d4cefd3ca2c4957ef0b2e7c6e39a7e4ae16390'
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 26 19:50:15 localhost kernel: Demotion targets for Node 0: null
Feb 26 19:50:15 localhost kernel: page_owner is disabled
Feb 26 19:50:15 localhost kernel: Key type .fscrypt registered
Feb 26 19:50:15 localhost kernel: Key type fscrypt-provisioning registered
Feb 26 19:50:15 localhost kernel: Key type big_key registered
Feb 26 19:50:15 localhost kernel: Key type encrypted registered
Feb 26 19:50:15 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 26 19:50:15 localhost kernel: Loading compiled-in module X.509 certificates
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: d9d4cefd3ca2c4957ef0b2e7c6e39a7e4ae16390'
Feb 26 19:50:15 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 26 19:50:15 localhost kernel: ima: No architecture policies found
Feb 26 19:50:15 localhost kernel: evm: Initialising EVM extended attributes:
Feb 26 19:50:15 localhost kernel: evm: security.selinux
Feb 26 19:50:15 localhost kernel: evm: security.SMACK64 (disabled)
Feb 26 19:50:15 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 26 19:50:15 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 26 19:50:15 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 26 19:50:15 localhost kernel: evm: security.apparmor (disabled)
Feb 26 19:50:15 localhost kernel: evm: security.ima
Feb 26 19:50:15 localhost kernel: evm: security.capability
Feb 26 19:50:15 localhost kernel: evm: HMAC attrs: 0x1
Feb 26 19:50:15 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 26 19:50:15 localhost kernel: Running certificate verification RSA selftest
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 26 19:50:15 localhost kernel: Running certificate verification ECDSA selftest
Feb 26 19:50:15 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 26 19:50:15 localhost kernel: clk: Disabling unused clocks
Feb 26 19:50:15 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 26 19:50:15 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 26 19:50:15 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 26 19:50:15 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 380K
Feb 26 19:50:15 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 26 19:50:15 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 26 19:50:15 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 26 19:50:15 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 26 19:50:15 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 26 19:50:15 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 26 19:50:15 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 26 19:50:15 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 26 19:50:15 localhost kernel: Run /init as init process
Feb 26 19:50:15 localhost kernel:   with arguments:
Feb 26 19:50:15 localhost kernel:     /init
Feb 26 19:50:15 localhost kernel:   with environment:
Feb 26 19:50:15 localhost kernel:     HOME=/
Feb 26 19:50:15 localhost kernel:     TERM=linux
Feb 26 19:50:15 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64
Feb 26 19:50:15 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 26 19:50:15 localhost systemd[1]: Detected virtualization kvm.
Feb 26 19:50:15 localhost systemd[1]: Detected architecture x86-64.
Feb 26 19:50:15 localhost systemd[1]: Running in initrd.
Feb 26 19:50:15 localhost systemd[1]: No hostname configured, using default hostname.
Feb 26 19:50:15 localhost systemd[1]: Hostname set to <localhost>.
Feb 26 19:50:15 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 26 19:50:15 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 26 19:50:15 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 26 19:50:15 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 26 19:50:15 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 26 19:50:15 localhost systemd[1]: Reached target Local File Systems.
Feb 26 19:50:15 localhost systemd[1]: Reached target Path Units.
Feb 26 19:50:15 localhost systemd[1]: Reached target Slice Units.
Feb 26 19:50:15 localhost systemd[1]: Reached target Swaps.
Feb 26 19:50:15 localhost systemd[1]: Reached target Timer Units.
Feb 26 19:50:15 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 26 19:50:15 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 26 19:50:15 localhost systemd[1]: Listening on Journal Socket.
Feb 26 19:50:15 localhost systemd[1]: Listening on udev Control Socket.
Feb 26 19:50:15 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 26 19:50:15 localhost systemd[1]: Reached target Socket Units.
Feb 26 19:50:15 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 26 19:50:15 localhost systemd[1]: Starting Journal Service...
Feb 26 19:50:15 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 26 19:50:15 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 26 19:50:15 localhost systemd[1]: Starting Create System Users...
Feb 26 19:50:15 localhost systemd[1]: Starting Setup Virtual Console...
Feb 26 19:50:15 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 26 19:50:15 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 26 19:50:15 localhost systemd-journald[307]: Journal started
Feb 26 19:50:15 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/35e489ed3c6448cc802f42161f451b28) is 8.0M, max 153.6M, 145.6M free.
Feb 26 19:50:15 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Feb 26 19:50:15 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Feb 26 19:50:15 localhost systemd[1]: Started Journal Service.
Feb 26 19:50:15 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 26 19:50:15 localhost systemd[1]: Finished Create System Users.
Feb 26 19:50:15 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 26 19:50:15 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 26 19:50:15 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 26 19:50:15 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 26 19:50:15 localhost systemd[1]: Finished Setup Virtual Console.
Feb 26 19:50:15 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 26 19:50:15 localhost systemd[1]: Starting dracut cmdline hook...
Feb 26 19:50:15 localhost dracut-cmdline[328]: dracut-9 dracut-057-110.git20260130.el9
Feb 26 19:50:15 localhost dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64 root=UUID=37391a25-080d-4723-8b0c-cb88a559875b ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 26 19:50:15 localhost systemd[1]: Finished dracut cmdline hook.
Feb 26 19:50:15 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 26 19:50:15 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 26 19:50:15 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 26 19:50:15 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 26 19:50:15 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 26 19:50:15 localhost kernel: RPC: Registered udp transport module.
Feb 26 19:50:15 localhost kernel: RPC: Registered tcp transport module.
Feb 26 19:50:15 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 26 19:50:15 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 26 19:50:15 localhost rpc.statd[445]: Version 2.5.4 starting
Feb 26 19:50:15 localhost rpc.statd[445]: Initializing NSM state
Feb 26 19:50:15 localhost rpc.idmapd[450]: Setting log level to 0
Feb 26 19:50:15 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 26 19:50:15 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 26 19:50:15 localhost systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Feb 26 19:50:15 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 26 19:50:15 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 26 19:50:15 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 26 19:50:15 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 26 19:50:15 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 26 19:50:15 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 26 19:50:15 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 26 19:50:15 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 26 19:50:15 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 26 19:50:15 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 26 19:50:15 localhost systemd[1]: Reached target Network.
Feb 26 19:50:15 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 26 19:50:15 localhost systemd[1]: Starting dracut initqueue hook...
Feb 26 19:50:15 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 26 19:50:15 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 26 19:50:15 localhost kernel:  vda: vda1
Feb 26 19:50:15 localhost systemd-udevd[490]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 19:50:15 localhost kernel: libata version 3.00 loaded.
Feb 26 19:50:15 localhost systemd[1]: Found device /dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b.
Feb 26 19:50:15 localhost kernel: ACPI: bus type drm_connector registered
Feb 26 19:50:15 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 26 19:50:15 localhost kernel: scsi host0: ata_piix
Feb 26 19:50:15 localhost kernel: scsi host1: ata_piix
Feb 26 19:50:15 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 26 19:50:15 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 26 19:50:16 localhost systemd[1]: Reached target Initrd Root Device.
Feb 26 19:50:16 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 26 19:50:16 localhost kernel: ata1: found unknown device (class 0)
Feb 26 19:50:16 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 26 19:50:16 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 26 19:50:16 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 26 19:50:16 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 26 19:50:16 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 26 19:50:16 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 26 19:50:16 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 26 19:50:16 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 26 19:50:16 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 26 19:50:16 localhost kernel: Console: switching to colour dummy device 80x25
Feb 26 19:50:16 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 26 19:50:16 localhost kernel: [drm] features: -context_init
Feb 26 19:50:16 localhost kernel: [drm] number of scanouts: 1
Feb 26 19:50:16 localhost kernel: [drm] number of cap sets: 0
Feb 26 19:50:16 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 26 19:50:16 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 26 19:50:16 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 26 19:50:16 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 26 19:50:16 localhost systemd[1]: Reached target System Initialization.
Feb 26 19:50:16 localhost systemd[1]: Reached target Basic System.
Feb 26 19:50:16 localhost systemd[1]: Finished dracut initqueue hook.
Feb 26 19:50:16 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 26 19:50:16 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 26 19:50:16 localhost systemd[1]: Reached target Remote File Systems.
Feb 26 19:50:16 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 26 19:50:16 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 26 19:50:16 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b...
Feb 26 19:50:16 localhost systemd-fsck[565]: /usr/sbin/fsck.xfs: XFS file system.
Feb 26 19:50:16 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b.
Feb 26 19:50:16 localhost systemd[1]: Mounting /sysroot...
Feb 26 19:50:16 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 26 19:50:16 localhost kernel: XFS (vda1): Mounting V5 Filesystem 37391a25-080d-4723-8b0c-cb88a559875b
Feb 26 19:50:16 localhost kernel: XFS (vda1): Ending clean mount
Feb 26 19:50:16 localhost systemd[1]: Mounted /sysroot.
Feb 26 19:50:16 localhost systemd[1]: Reached target Initrd Root File System.
Feb 26 19:50:16 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 26 19:50:16 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 26 19:50:16 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 26 19:50:16 localhost systemd[1]: Reached target Initrd File Systems.
Feb 26 19:50:16 localhost systemd[1]: Reached target Initrd Default Target.
Feb 26 19:50:16 localhost systemd[1]: Starting dracut mount hook...
Feb 26 19:50:16 localhost systemd[1]: Finished dracut mount hook.
Feb 26 19:50:16 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 26 19:50:17 localhost rpc.idmapd[450]: exiting on signal 15
Feb 26 19:50:17 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 26 19:50:17 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 26 19:50:17 localhost systemd[1]: Stopped target Network.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Timer Units.
Feb 26 19:50:17 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 26 19:50:17 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Basic System.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Path Units.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Remote File Systems.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Slice Units.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Socket Units.
Feb 26 19:50:17 localhost systemd[1]: Stopped target System Initialization.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Local File Systems.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Swaps.
Feb 26 19:50:17 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut mount hook.
Feb 26 19:50:17 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 26 19:50:17 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 26 19:50:17 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 26 19:50:17 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 26 19:50:17 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 26 19:50:17 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 26 19:50:17 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 26 19:50:17 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 26 19:50:17 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 26 19:50:17 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 26 19:50:17 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 26 19:50:17 localhost systemd[1]: systemd-udevd.service: Consumed 1.122s CPU time.
Feb 26 19:50:17 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 26 19:50:17 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Closed udev Control Socket.
Feb 26 19:50:17 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Closed udev Kernel Socket.
Feb 26 19:50:17 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 26 19:50:17 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 26 19:50:17 localhost systemd[1]: Starting Cleanup udev Database...
Feb 26 19:50:17 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 26 19:50:17 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 26 19:50:17 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Stopped Create System Users.
Feb 26 19:50:17 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 26 19:50:17 localhost systemd[1]: Finished Cleanup udev Database.
Feb 26 19:50:17 localhost systemd[1]: Reached target Switch Root.
Feb 26 19:50:17 localhost systemd[1]: Starting Switch Root...
Feb 26 19:50:17 localhost systemd[1]: Switching root.
Feb 26 19:50:17 localhost systemd-journald[307]: Journal stopped
Feb 26 19:50:18 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Feb 26 19:50:18 localhost kernel: audit: type=1404 audit(1772135417.466:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability open_perms=1
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 19:50:18 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 19:50:18 localhost kernel: audit: type=1403 audit(1772135417.577:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 26 19:50:18 localhost systemd[1]: Successfully loaded SELinux policy in 114.403ms.
Feb 26 19:50:18 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.515ms.
Feb 26 19:50:18 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 26 19:50:18 localhost systemd[1]: Detected virtualization kvm.
Feb 26 19:50:18 localhost systemd[1]: Detected architecture x86-64.
Feb 26 19:50:18 localhost systemd-rc-local-generator[645]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 19:50:18 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Stopped Switch Root.
Feb 26 19:50:18 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 26 19:50:18 localhost systemd[1]: Created slice Slice /system/getty.
Feb 26 19:50:18 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 26 19:50:18 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 26 19:50:18 localhost systemd[1]: Created slice User and Session Slice.
Feb 26 19:50:18 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 26 19:50:18 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 26 19:50:18 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 26 19:50:18 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 26 19:50:18 localhost systemd[1]: Stopped target Switch Root.
Feb 26 19:50:18 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 26 19:50:18 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 26 19:50:18 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 26 19:50:18 localhost systemd[1]: Reached target Path Units.
Feb 26 19:50:18 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 26 19:50:18 localhost systemd[1]: Reached target Slice Units.
Feb 26 19:50:18 localhost systemd[1]: Reached target Swaps.
Feb 26 19:50:18 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 26 19:50:18 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 26 19:50:18 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 26 19:50:18 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 26 19:50:18 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 26 19:50:18 localhost systemd[1]: Listening on udev Control Socket.
Feb 26 19:50:18 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 26 19:50:18 localhost systemd[1]: Mounting Huge Pages File System...
Feb 26 19:50:18 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 26 19:50:18 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 26 19:50:18 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 26 19:50:18 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 26 19:50:18 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 26 19:50:18 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 26 19:50:18 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 26 19:50:18 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 26 19:50:18 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 26 19:50:18 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 26 19:50:18 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 26 19:50:18 localhost systemd[1]: Stopped Journal Service.
Feb 26 19:50:18 localhost systemd[1]: Starting Journal Service...
Feb 26 19:50:18 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 26 19:50:18 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 26 19:50:18 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 26 19:50:18 localhost systemd-journald[693]: Journal started
Feb 26 19:50:18 localhost systemd-journald[693]: Runtime Journal (/run/log/journal/45af4031c1bdc072f1f045c25038675f) is 8.0M, max 153.6M, 145.6M free.
Feb 26 19:50:17 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 26 19:50:17 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 26 19:50:18 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 26 19:50:18 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 26 19:50:18 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 26 19:50:18 localhost kernel: fuse: init (API version 7.37)
Feb 26 19:50:18 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 26 19:50:18 localhost systemd[1]: Started Journal Service.
Feb 26 19:50:18 localhost systemd[1]: Mounted Huge Pages File System.
Feb 26 19:50:18 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 26 19:50:18 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 26 19:50:18 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 26 19:50:18 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 26 19:50:18 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 26 19:50:18 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 26 19:50:18 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 26 19:50:18 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 26 19:50:18 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 26 19:50:18 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 26 19:50:18 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 26 19:50:18 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 26 19:50:18 localhost systemd[1]: Mounting FUSE Control File System...
Feb 26 19:50:18 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 26 19:50:18 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 26 19:50:18 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 26 19:50:18 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 26 19:50:18 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 26 19:50:18 localhost systemd[1]: Starting Create System Users...
Feb 26 19:50:18 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 26 19:50:18 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 26 19:50:18 localhost systemd[1]: Mounted FUSE Control File System.
Feb 26 19:50:18 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 26 19:50:18 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 26 19:50:18 localhost systemd-journald[693]: Runtime Journal (/run/log/journal/45af4031c1bdc072f1f045c25038675f) is 8.0M, max 153.6M, 145.6M free.
Feb 26 19:50:18 localhost systemd-journald[693]: Received client request to flush runtime journal.
Feb 26 19:50:18 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 26 19:50:18 localhost systemd[1]: Finished Create System Users.
Feb 26 19:50:18 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 26 19:50:18 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 26 19:50:18 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 26 19:50:18 localhost systemd[1]: Reached target Local File Systems.
Feb 26 19:50:18 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 26 19:50:18 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 26 19:50:18 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 26 19:50:18 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 26 19:50:18 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 26 19:50:18 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 26 19:50:18 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 26 19:50:18 localhost bootctl[713]: Couldn't find EFI system partition, skipping.
Feb 26 19:50:18 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 26 19:50:18 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 26 19:50:18 localhost systemd[1]: Starting Security Auditing Service...
Feb 26 19:50:18 localhost systemd[1]: Starting RPC Bind...
Feb 26 19:50:18 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 26 19:50:18 localhost auditd[719]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 26 19:50:18 localhost auditd[719]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 26 19:50:18 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 26 19:50:18 localhost systemd[1]: Started RPC Bind.
Feb 26 19:50:18 localhost augenrules[724]: /sbin/augenrules: No change
Feb 26 19:50:18 localhost augenrules[739]: No rules
Feb 26 19:50:18 localhost augenrules[739]: enabled 1
Feb 26 19:50:18 localhost augenrules[739]: failure 1
Feb 26 19:50:18 localhost augenrules[739]: pid 719
Feb 26 19:50:18 localhost augenrules[739]: rate_limit 0
Feb 26 19:50:18 localhost augenrules[739]: backlog_limit 8192
Feb 26 19:50:18 localhost augenrules[739]: lost 0
Feb 26 19:50:18 localhost augenrules[739]: backlog 3
Feb 26 19:50:18 localhost augenrules[739]: backlog_wait_time 60000
Feb 26 19:50:18 localhost augenrules[739]: backlog_wait_time_actual 0
Feb 26 19:50:18 localhost augenrules[739]: enabled 1
Feb 26 19:50:18 localhost augenrules[739]: failure 1
Feb 26 19:50:18 localhost augenrules[739]: pid 719
Feb 26 19:50:18 localhost augenrules[739]: rate_limit 0
Feb 26 19:50:18 localhost augenrules[739]: backlog_limit 8192
Feb 26 19:50:18 localhost augenrules[739]: lost 0
Feb 26 19:50:18 localhost augenrules[739]: backlog 3
Feb 26 19:50:18 localhost augenrules[739]: backlog_wait_time 60000
Feb 26 19:50:18 localhost augenrules[739]: backlog_wait_time_actual 0
Feb 26 19:50:18 localhost augenrules[739]: enabled 1
Feb 26 19:50:18 localhost augenrules[739]: failure 1
Feb 26 19:50:18 localhost augenrules[739]: pid 719
Feb 26 19:50:18 localhost augenrules[739]: rate_limit 0
Feb 26 19:50:18 localhost augenrules[739]: backlog_limit 8192
Feb 26 19:50:18 localhost augenrules[739]: lost 0
Feb 26 19:50:18 localhost augenrules[739]: backlog 2
Feb 26 19:50:18 localhost augenrules[739]: backlog_wait_time 60000
Feb 26 19:50:18 localhost augenrules[739]: backlog_wait_time_actual 0
Feb 26 19:50:18 localhost systemd[1]: Started Security Auditing Service.
Feb 26 19:50:18 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 26 19:50:18 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 26 19:50:18 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 26 19:50:18 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 26 19:50:18 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 26 19:50:18 localhost systemd[1]: Starting Update is Completed...
Feb 26 19:50:18 localhost systemd[1]: Finished Update is Completed.
Feb 26 19:50:18 localhost systemd-udevd[747]: Using default interface naming scheme 'rhel-9.0'.
Feb 26 19:50:19 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 26 19:50:19 localhost systemd[1]: Reached target System Initialization.
Feb 26 19:50:19 localhost systemd[1]: Started dnf makecache --timer.
Feb 26 19:50:19 localhost systemd[1]: Started Daily rotation of log files.
Feb 26 19:50:19 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 26 19:50:19 localhost systemd[1]: Reached target Timer Units.
Feb 26 19:50:19 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 26 19:50:19 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 26 19:50:19 localhost systemd[1]: Reached target Socket Units.
Feb 26 19:50:19 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 26 19:50:19 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 26 19:50:19 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 26 19:50:19 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 26 19:50:19 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 26 19:50:19 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 26 19:50:19 localhost systemd-udevd[753]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 19:50:19 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 26 19:50:19 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 26 19:50:19 localhost systemd[1]: Reached target Basic System.
Feb 26 19:50:19 localhost dbus-broker-lau[785]: Ready
Feb 26 19:50:19 localhost systemd[1]: Starting NTP client/server...
Feb 26 19:50:19 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 26 19:50:19 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 26 19:50:19 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 26 19:50:19 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 26 19:50:19 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 26 19:50:19 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 26 19:50:19 localhost chronyd[815]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 26 19:50:19 localhost chronyd[815]: Loaded 0 symmetric keys
Feb 26 19:50:19 localhost chronyd[815]: Using right/UTC timezone to obtain leap second data
Feb 26 19:50:19 localhost chronyd[815]: Loaded seccomp filter (level 2)
Feb 26 19:50:19 localhost systemd[1]: Started irqbalance daemon.
Feb 26 19:50:19 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 26 19:50:19 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 26 19:50:19 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 26 19:50:19 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 26 19:50:19 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 26 19:50:19 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 26 19:50:19 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 26 19:50:19 localhost systemd[1]: Starting User Login Management...
Feb 26 19:50:19 localhost systemd[1]: Started NTP client/server.
Feb 26 19:50:19 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 26 19:50:19 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 26 19:50:19 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 26 19:50:19 localhost kernel: kvm_amd: TSC scaling supported
Feb 26 19:50:19 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 26 19:50:19 localhost kernel: kvm_amd: Nested Paging enabled
Feb 26 19:50:19 localhost kernel: kvm_amd: LBR virtualization supported
Feb 26 19:50:19 localhost systemd-logind[825]: New seat seat0.
Feb 26 19:50:19 localhost systemd-logind[825]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 26 19:50:19 localhost systemd-logind[825]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 26 19:50:19 localhost systemd[1]: Started User Login Management.
Feb 26 19:50:19 localhost iptables.init[800]: iptables: Applying firewall rules: [  OK  ]
Feb 26 19:50:19 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 26 19:50:19 localhost cloud-init[851]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 26 Feb 2026 19:50:19 +0000. Up 6.39 seconds.
Feb 26 19:50:19 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 26 19:50:19 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 26 19:50:20 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpe9h93z1u.mount: Deactivated successfully.
Feb 26 19:50:20 localhost systemd[1]: Starting Hostname Service...
Feb 26 19:50:20 localhost systemd[1]: Started Hostname Service.
Feb 26 19:50:20 np0005631999.novalocal systemd-hostnamed[865]: Hostname set to <np0005631999.novalocal> (static)
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Reached target Preparation for Network.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Starting Network Manager...
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.3904] NetworkManager (version 1.54.3-2.el9) is starting... (boot:f3703019-5d3d-46b0-a4dd-d38ee7f05396)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.3910] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4102] manager[0x55d43a188000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4170] hostname: hostname: using hostnamed
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4170] hostname: static hostname changed from (none) to "np0005631999.novalocal"
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4174] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4323] manager[0x55d43a188000]: rfkill: Wi-Fi hardware radio set enabled
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4323] manager[0x55d43a188000]: rfkill: WWAN hardware radio set enabled
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4413] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4416] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4416] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4417] manager: Networking is enabled by state file
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4418] settings: Loaded settings plugin: keyfile (internal)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4447] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4475] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4491] dhcp: init: Using DHCP client 'internal'
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4495] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4511] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4521] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4531] device (lo): Activation: starting connection 'lo' (dd0b54dd-8e74-4c4e-991a-59513cc199d2)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4538] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4542] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4568] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4574] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4577] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4579] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4582] device (eth0): carrier: link connected
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4585] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4593] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4601] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4605] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4607] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4611] manager: NetworkManager state is now CONNECTING
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4613] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4620] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4623] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4661] dhcp4 (eth0): state changed new lease, address=38.102.83.12
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4669] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Started Network Manager.
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4691] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Reached target Network.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4924] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4927] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4930] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4941] device (lo): Activation: successful, device activated.
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4953] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4961] manager: NetworkManager state is now CONNECTED_SITE
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4965] device (eth0): Activation: successful, device activated.
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4981] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 26 19:50:20 np0005631999.novalocal NetworkManager[869]: <info>  [1772135420.4990] manager: startup complete
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Reached target NFS client services.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Reached target Remote File Systems.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 26 19:50:20 np0005631999.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 26 Feb 2026 19:50:20 +0000. Up 7.43 seconds.
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |  eth0  | True |         38.102.83.12        | 255.255.255.0 | global | fa:16:3e:b3:08:e6 |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |  eth0  | True | fe80::f816:3eff:feb3:8e6/64 |       .       |  link  | fa:16:3e:b3:08:e6 |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 26 19:50:20 np0005631999.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 26 19:50:21 np0005631999.novalocal useradd[999]: new group: name=cloud-user, GID=1001
Feb 26 19:50:21 np0005631999.novalocal useradd[999]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 26 19:50:21 np0005631999.novalocal useradd[999]: add 'cloud-user' to group 'adm'
Feb 26 19:50:21 np0005631999.novalocal useradd[999]: add 'cloud-user' to group 'systemd-journal'
Feb 26 19:50:21 np0005631999.novalocal useradd[999]: add 'cloud-user' to shadow group 'adm'
Feb 26 19:50:21 np0005631999.novalocal useradd[999]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Generating public/private rsa key pair.
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: The key fingerprint is:
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: SHA256:OWWrZD7qasbN5c8HY6H63k2fcI/cUlsfcu3Ub1pBWFE root@np0005631999.novalocal
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: The key's randomart image is:
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: +---[RSA 3072]----+
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |               oE|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |              o  |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |          o  . . |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |         +..  .  |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |        S...   .o|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |       +oo+  . oB|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |   . o ++. oo ++B|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |    + +..+ o.=.B=|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |   o.oooo.+.. *+o|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Generating public/private ecdsa key pair.
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: The key fingerprint is:
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: SHA256:ZtsQ9Qy+Y+7UTmd0xbKrApjyko81v0QfJd2gxDIUqQU root@np0005631999.novalocal
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: The key's randomart image is:
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: +---[ECDSA 256]---+
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |      Eoo++ .    |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |        =+.* o . |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |       o.o+ = o o|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |      .  . +   o.|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |       oS =   o .|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |    . o+.B + . o |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |     +o o.= o +  |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |    oo.+ o.o +   |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |    .o. o...o    |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Generating public/private ed25519 key pair.
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: The key fingerprint is:
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: SHA256:oWPsSXjc8/j96wbKFEDN77eTFVfW4yHOyqvBRi6+xhc root@np0005631999.novalocal
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: The key's randomart image is:
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: +--[ED25519 256]--+
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |       ..o      .|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |        . o  . o+|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |        .. .o o.+|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |     + o .. .o o.|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |    . O S..o.   o|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |     = o+E.oo . .|
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |      +..Bo..o + |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |      .ooo+o  =  |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: |      .oo.o .++o |
Feb 26 19:50:22 np0005631999.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Reached target Network is Online.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting System Logging Service...
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting Permit User Sessions...
Feb 26 19:50:22 np0005631999.novalocal sm-notify[1015]: Version 2.5.4 starting
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Finished Permit User Sessions.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Started Command Scheduler.
Feb 26 19:50:22 np0005631999.novalocal sshd[1017]: Server listening on 0.0.0.0 port 22.
Feb 26 19:50:22 np0005631999.novalocal sshd[1017]: Server listening on :: port 22.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Started Getty on tty1.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Reached target Login Prompts.
Feb 26 19:50:22 np0005631999.novalocal crond[1020]: (CRON) STARTUP (1.5.7)
Feb 26 19:50:22 np0005631999.novalocal crond[1020]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 26 19:50:22 np0005631999.novalocal crond[1020]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 3% if used.)
Feb 26 19:50:22 np0005631999.novalocal crond[1020]: (CRON) INFO (running with inotify support)
Feb 26 19:50:22 np0005631999.novalocal rsyslogd[1016]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1016" x-info="https://www.rsyslog.com"] start
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Started System Logging Service.
Feb 26 19:50:22 np0005631999.novalocal rsyslogd[1016]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Reached target Multi-User System.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 26 19:50:22 np0005631999.novalocal rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 19:50:22 np0005631999.novalocal kdumpctl[1025]: kdump: No kdump initial ramdisk found.
Feb 26 19:50:22 np0005631999.novalocal kdumpctl[1025]: kdump: Rebuilding /boot/initramfs-5.14.0-686.el9.x86_64kdump.img
Feb 26 19:50:22 np0005631999.novalocal cloud-init[1164]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 26 Feb 2026 19:50:22 +0000. Up 9.17 seconds.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 26 19:50:22 np0005631999.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 26 19:50:22 np0005631999.novalocal cloud-init[1384]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 26 Feb 2026 19:50:22 +0000. Up 9.53 seconds.
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1432]: #############################################################
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1441]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1450]: 256 SHA256:ZtsQ9Qy+Y+7UTmd0xbKrApjyko81v0QfJd2gxDIUqQU root@np0005631999.novalocal (ECDSA)
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1457]: 256 SHA256:oWPsSXjc8/j96wbKFEDN77eTFVfW4yHOyqvBRi6+xhc root@np0005631999.novalocal (ED25519)
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1461]: 3072 SHA256:OWWrZD7qasbN5c8HY6H63k2fcI/cUlsfcu3Ub1pBWFE root@np0005631999.novalocal (RSA)
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1464]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1466]: #############################################################
Feb 26 19:50:23 np0005631999.novalocal cloud-init[1384]: Cloud-init v. 24.4-8.el9 finished at Thu, 26 Feb 2026 19:50:23 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.73 seconds
Feb 26 19:50:23 np0005631999.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 26 19:50:23 np0005631999.novalocal systemd[1]: Reached target Cloud-init target.
Feb 26 19:50:23 np0005631999.novalocal dracut[1521]: dracut-057-110.git20260130.el9
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1539]: Unable to negotiate with 38.102.83.114 port 37488: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1543]: Connection reset by 38.102.83.114 port 37490 [preauth]
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-686.el9.x86_64kdump.img 5.14.0-686.el9.x86_64
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1552]: Unable to negotiate with 38.102.83.114 port 37498: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1561]: Unable to negotiate with 38.102.83.114 port 37508: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1537]: Connection closed by 38.102.83.114 port 37482 [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1576]: Connection reset by 38.102.83.114 port 37526 [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1594]: Unable to negotiate with 38.102.83.114 port 37530: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1602]: Unable to negotiate with 38.102.83.114 port 37544: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 26 19:50:23 np0005631999.novalocal sshd-session[1567]: Connection closed by 38.102.83.114 port 37514 [preauth]
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 26 19:50:23 np0005631999.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: memstrack is not available
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: memstrack is not available
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 26 19:50:24 np0005631999.novalocal dracut[1523]: *** Including module: systemd ***
Feb 26 19:50:25 np0005631999.novalocal dracut[1523]: *** Including module: fips ***
Feb 26 19:50:25 np0005631999.novalocal chronyd[815]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Feb 26 19:50:25 np0005631999.novalocal chronyd[815]: System clock TAI offset set to 37 seconds
Feb 26 19:50:25 np0005631999.novalocal dracut[1523]: *** Including module: systemd-initrd ***
Feb 26 19:50:25 np0005631999.novalocal dracut[1523]: *** Including module: i18n ***
Feb 26 19:50:25 np0005631999.novalocal dracut[1523]: *** Including module: drm ***
Feb 26 19:50:25 np0005631999.novalocal dracut[1523]: *** Including module: prefixdevname ***
Feb 26 19:50:25 np0005631999.novalocal dracut[1523]: *** Including module: kernel-modules ***
Feb 26 19:50:26 np0005631999.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]: *** Including module: kernel-modules-extra ***
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]: *** Including module: qemu ***
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]: *** Including module: fstab-sys ***
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]: *** Including module: rootfs-block ***
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]: *** Including module: terminfo ***
Feb 26 19:50:26 np0005631999.novalocal dracut[1523]: *** Including module: udev-rules ***
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: Skipping udev rule: 91-permissions.rules
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: *** Including module: virtiofs ***
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: *** Including module: dracut-systemd ***
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: *** Including module: usrmount ***
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: *** Including module: base ***
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: *** Including module: fs-lib ***
Feb 26 19:50:27 np0005631999.novalocal dracut[1523]: *** Including module: kdumpbase ***
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:   microcode_ctl module: mangling fw_dir
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]: *** Including module: openssl ***
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]: *** Including module: shutdown ***
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]: *** Including module: squash ***
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]: *** Including modules done ***
Feb 26 19:50:28 np0005631999.novalocal dracut[1523]: *** Installing kernel module dependencies ***
Feb 26 19:50:29 np0005631999.novalocal dracut[1523]: *** Installing kernel module dependencies done ***
Feb 26 19:50:29 np0005631999.novalocal dracut[1523]: *** Resolving executable dependencies ***
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: IRQ 25 affinity is now unmanaged
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: IRQ 31 affinity is now unmanaged
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: IRQ 28 affinity is now unmanaged
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: IRQ 32 affinity is now unmanaged
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: IRQ 30 affinity is now unmanaged
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 26 19:50:29 np0005631999.novalocal irqbalance[809]: IRQ 29 affinity is now unmanaged
Feb 26 19:50:30 np0005631999.novalocal dracut[1523]: *** Resolving executable dependencies done ***
Feb 26 19:50:30 np0005631999.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 26 19:50:30 np0005631999.novalocal dracut[1523]: *** Generating early-microcode cpio image ***
Feb 26 19:50:30 np0005631999.novalocal dracut[1523]: *** Store current command line parameters ***
Feb 26 19:50:30 np0005631999.novalocal dracut[1523]: Stored kernel commandline:
Feb 26 19:50:30 np0005631999.novalocal dracut[1523]: No dracut internal kernel commandline stored in the initramfs
Feb 26 19:50:30 np0005631999.novalocal dracut[1523]: *** Install squash loader ***
Feb 26 19:50:31 np0005631999.novalocal dracut[1523]: *** Squashing the files inside the initramfs ***
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: *** Squashing the files inside the initramfs done ***
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: *** Creating image file '/boot/initramfs-5.14.0-686.el9.x86_64kdump.img' ***
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: *** Hardlinking files ***
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Mode:           real
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Files:          50
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Linked:         0 files
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Compared:       0 xattrs
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Compared:       0 files
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Saved:          0 B
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: Duration:       0.000724 seconds
Feb 26 19:50:32 np0005631999.novalocal dracut[1523]: *** Hardlinking files done ***
Feb 26 19:50:33 np0005631999.novalocal dracut[1523]: *** Creating initramfs image file '/boot/initramfs-5.14.0-686.el9.x86_64kdump.img' done ***
Feb 26 19:50:33 np0005631999.novalocal kdumpctl[1025]: kdump: kexec: loaded kdump kernel
Feb 26 19:50:33 np0005631999.novalocal kdumpctl[1025]: kdump: Starting kdump: [OK]
Feb 26 19:50:33 np0005631999.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 26 19:50:33 np0005631999.novalocal systemd[1]: Startup finished in 1.534s (kernel) + 2.531s (initrd) + 16.275s (userspace) = 20.341s.
Feb 26 19:50:45 np0005631999.novalocal sshd-session[4791]: Accepted publickey for zuul from 38.102.83.114 port 37216 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 26 19:50:45 np0005631999.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 26 19:50:45 np0005631999.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 26 19:50:45 np0005631999.novalocal systemd-logind[825]: New session 1 of user zuul.
Feb 26 19:50:45 np0005631999.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 26 19:50:45 np0005631999.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Queued start job for default target Main User Target.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Created slice User Application Slice.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Started Daily Cleanup of User's Temporary Directories.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Reached target Paths.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Reached target Timers.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Starting D-Bus User Message Bus Socket...
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Starting Create User's Volatile Files and Directories...
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Finished Create User's Volatile Files and Directories.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Listening on D-Bus User Message Bus Socket.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Reached target Sockets.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Reached target Basic System.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Reached target Main User Target.
Feb 26 19:50:45 np0005631999.novalocal systemd[4795]: Startup finished in 132ms.
Feb 26 19:50:45 np0005631999.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 26 19:50:45 np0005631999.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 26 19:50:45 np0005631999.novalocal sshd-session[4791]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 19:50:45 np0005631999.novalocal python3[4877]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 19:50:48 np0005631999.novalocal python3[4905]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 19:50:50 np0005631999.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 26 19:50:54 np0005631999.novalocal python3[4965]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 19:50:54 np0005631999.novalocal python3[5005]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 26 19:50:56 np0005631999.novalocal python3[5031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRNKWvdG8yPZXSoFLQXDW89T4fvo5ST6Cd3jTVPoaeGGxsyesv8+kMnG8+o1TD3S4h2kZ0gDQm3Wmk+48rQ6cVf9a3JoH92az+XeNPoVjcEwPmLGpjmh6B9/yshc8+7vIcbNoOOevoJVS0l/GZlfmapvJ6YAj0rT2i9CQCatOoKkVpKQpwWLwRyaTINOhgY9YHbXw6oTD1wqAxXhm2WsIg1sli8addrD0Qa6rx4wT05aw30u6MqeViLVbu9MWtt+9TV/5QHCW2lwbSuSdekWibgT7RTT2Rs5vu+yDjFam+NUSlQpbFAv+oOnApuiIq0lIGUwGZLHjsNsA+FXC8o0vdfWrVqh9GTdbCPvuPVXWaCr2g367Kom/BDSLwSPDIb3jGUHG7Pcio5D+L0UrlULb3qX7s5/N6rIHgYfzAuDIOY4kA6re5wxQP7EI/CUloD5u1CWco78I4PBElaSA4AFOZPhMX6XSb1M/adOPRCZWadlBTG7jKGzuptwLsgAE88B0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:50:57 np0005631999.novalocal python3[5055]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:50:57 np0005631999.novalocal python3[5154]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:50:58 np0005631999.novalocal python3[5225]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772135457.5410807-207-265805878400109/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=5f415607bf8f48e285d7d787aaeef042_id_rsa follow=False checksum=431facf2d503f98c2d81f7e30700ad1b9efd3b29 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:50:58 np0005631999.novalocal python3[5348]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:50:59 np0005631999.novalocal python3[5419]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772135458.4555147-240-160665425885550/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=5f415607bf8f48e285d7d787aaeef042_id_rsa.pub follow=False checksum=8974bad97cc32beb8c5f8380f50ccbfe6dc5a296 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:00 np0005631999.novalocal python3[5467]: ansible-ping Invoked with data=pong
Feb 26 19:51:01 np0005631999.novalocal python3[5491]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 19:51:04 np0005631999.novalocal python3[5549]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 26 19:51:05 np0005631999.novalocal python3[5581]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:05 np0005631999.novalocal python3[5605]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:05 np0005631999.novalocal python3[5629]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:05 np0005631999.novalocal python3[5653]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:06 np0005631999.novalocal python3[5677]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:06 np0005631999.novalocal python3[5701]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:07 np0005631999.novalocal sudo[5725]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfgbizmkzghmasbzapbkkgluodktnddt ; /usr/bin/python3'
Feb 26 19:51:07 np0005631999.novalocal sudo[5725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:08 np0005631999.novalocal python3[5727]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:08 np0005631999.novalocal sudo[5725]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:08 np0005631999.novalocal sudo[5803]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqgmbgfceursinuuwwietzmgifwhapsl ; /usr/bin/python3'
Feb 26 19:51:08 np0005631999.novalocal sudo[5803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:08 np0005631999.novalocal python3[5805]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:51:08 np0005631999.novalocal sudo[5803]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:08 np0005631999.novalocal sudo[5876]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flmcswkdgiptpvodkfffoxbiybdijyko ; /usr/bin/python3'
Feb 26 19:51:08 np0005631999.novalocal sudo[5876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:09 np0005631999.novalocal python3[5878]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772135468.1615357-21-262948720440103/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:09 np0005631999.novalocal sudo[5876]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:09 np0005631999.novalocal python3[5926]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:09 np0005631999.novalocal python3[5950]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:10 np0005631999.novalocal python3[5974]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:10 np0005631999.novalocal python3[5998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:10 np0005631999.novalocal python3[6022]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:10 np0005631999.novalocal python3[6046]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:11 np0005631999.novalocal python3[6070]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:11 np0005631999.novalocal python3[6094]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:11 np0005631999.novalocal python3[6118]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:12 np0005631999.novalocal python3[6142]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:12 np0005631999.novalocal python3[6166]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:12 np0005631999.novalocal python3[6190]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:12 np0005631999.novalocal python3[6214]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:13 np0005631999.novalocal python3[6238]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:13 np0005631999.novalocal python3[6262]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:13 np0005631999.novalocal python3[6286]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:13 np0005631999.novalocal python3[6310]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:14 np0005631999.novalocal python3[6334]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:14 np0005631999.novalocal python3[6358]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:14 np0005631999.novalocal python3[6382]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:15 np0005631999.novalocal python3[6406]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:15 np0005631999.novalocal python3[6430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:15 np0005631999.novalocal python3[6454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:15 np0005631999.novalocal python3[6478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:16 np0005631999.novalocal python3[6502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:16 np0005631999.novalocal python3[6526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 19:51:19 np0005631999.novalocal sudo[6550]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyjwewjvqnxkmgvateqdavoqkumvrcly ; /usr/bin/python3'
Feb 26 19:51:19 np0005631999.novalocal sudo[6550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:19 np0005631999.novalocal python3[6552]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 26 19:51:19 np0005631999.novalocal systemd[1]: Starting Time & Date Service...
Feb 26 19:51:19 np0005631999.novalocal systemd[1]: Started Time & Date Service.
Feb 26 19:51:19 np0005631999.novalocal systemd-timedated[6554]: Changed time zone to 'UTC' (UTC).
Feb 26 19:51:19 np0005631999.novalocal sudo[6550]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:19 np0005631999.novalocal sudo[6581]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cordtudvcyjdvlhiuoctihbsdzjjwash ; /usr/bin/python3'
Feb 26 19:51:19 np0005631999.novalocal sudo[6581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:19 np0005631999.novalocal python3[6583]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:19 np0005631999.novalocal sudo[6581]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:20 np0005631999.novalocal python3[6659]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:51:20 np0005631999.novalocal python3[6730]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1772135480.0130072-153-78854863814986/source _original_basename=tmpf6wxmcun follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:21 np0005631999.novalocal python3[6830]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:51:21 np0005631999.novalocal python3[6901]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1772135480.9553905-183-95224160671606/source _original_basename=tmp0ono1u_o follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:22 np0005631999.novalocal sudo[7001]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sneaxzlxaqgsaafccssttmijfafpbtee ; /usr/bin/python3'
Feb 26 19:51:22 np0005631999.novalocal sudo[7001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:22 np0005631999.novalocal python3[7003]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:51:22 np0005631999.novalocal sudo[7001]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:22 np0005631999.novalocal sudo[7074]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlqpuwpnxrldxohwzfrqfpxegyupvuwj ; /usr/bin/python3'
Feb 26 19:51:22 np0005631999.novalocal sudo[7074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:22 np0005631999.novalocal python3[7076]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1772135482.1280723-231-6981498205192/source _original_basename=tmpxedrdrs5 follow=False checksum=bee07c3642df91d0fc882b8d0517473ba529622f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:22 np0005631999.novalocal sudo[7074]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:23 np0005631999.novalocal python3[7124]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 19:51:23 np0005631999.novalocal python3[7150]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 19:51:23 np0005631999.novalocal sudo[7228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvwvmrejuduadnkcuejfpnjhuggzhkxx ; /usr/bin/python3'
Feb 26 19:51:23 np0005631999.novalocal sudo[7228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:24 np0005631999.novalocal python3[7230]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:51:24 np0005631999.novalocal sudo[7228]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:24 np0005631999.novalocal sudo[7301]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-espavnhztfzhiocpdjignrbdubpmzppw ; /usr/bin/python3'
Feb 26 19:51:24 np0005631999.novalocal sudo[7301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:24 np0005631999.novalocal python3[7303]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1772135483.7785473-273-240681291698608/source _original_basename=tmpot3nzke8 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:24 np0005631999.novalocal sudo[7301]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:24 np0005631999.novalocal sudo[7352]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwcavbyuptyxitxrqxubssjwsupnmcvr ; /usr/bin/python3'
Feb 26 19:51:24 np0005631999.novalocal sudo[7352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:25 np0005631999.novalocal python3[7354]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-590e-5fdf-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 19:51:25 np0005631999.novalocal sudo[7352]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:25 np0005631999.novalocal python3[7382]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-590e-5fdf-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 26 19:51:26 np0005631999.novalocal python3[7411]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:43 np0005631999.novalocal sudo[7435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxzdhrgolalqvurrimmewveyzrnstwpt ; /usr/bin/python3'
Feb 26 19:51:43 np0005631999.novalocal sudo[7435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:51:43 np0005631999.novalocal python3[7437]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:51:43 np0005631999.novalocal sudo[7435]: pam_unix(sudo:session): session closed for user root
Feb 26 19:51:49 np0005631999.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 26 19:52:15 np0005631999.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 26 19:52:15 np0005631999.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9531] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 26 19:52:15 np0005631999.novalocal systemd-udevd[7440]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9846] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9875] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9879] device (eth1): carrier: link connected
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9881] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9887] policy: auto-activating connection 'Wired connection 1' (6899c43b-9685-3591-bd57-0b0b5009002c)
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9892] device (eth1): Activation: starting connection 'Wired connection 1' (6899c43b-9685-3591-bd57-0b0b5009002c)
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9893] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9898] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9902] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 19:52:15 np0005631999.novalocal NetworkManager[869]: <info>  [1772135535.9907] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 26 19:52:16 np0005631999.novalocal python3[7467]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-bfef-2378-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 19:52:23 np0005631999.novalocal sudo[7545]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehaenyiogalqyqruyfsthznhabbeshmk ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 26 19:52:23 np0005631999.novalocal sudo[7545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:52:23 np0005631999.novalocal python3[7547]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:52:23 np0005631999.novalocal sudo[7545]: pam_unix(sudo:session): session closed for user root
Feb 26 19:52:23 np0005631999.novalocal sudo[7618]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcozzcrboyjafmbwajwupyczuescwzrb ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 26 19:52:23 np0005631999.novalocal sudo[7618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:52:23 np0005631999.novalocal python3[7620]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772135543.242035-102-213561096150056/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=a1c6d9c8859a801d492af15be0b7be695e8d43b6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:52:23 np0005631999.novalocal sudo[7618]: pam_unix(sudo:session): session closed for user root
Feb 26 19:52:24 np0005631999.novalocal sudo[7668]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmxlbonzvebmffhsvceyboflgjiqlcnq ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 26 19:52:24 np0005631999.novalocal sudo[7668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:52:24 np0005631999.novalocal python3[7670]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Stopping Network Manager...
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7222] caught SIGTERM, shutting down normally.
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7238] dhcp4 (eth0): canceled DHCP transaction
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7239] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7239] dhcp4 (eth0): state changed no lease
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7245] manager: NetworkManager state is now CONNECTING
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7420] dhcp4 (eth1): canceled DHCP transaction
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7420] dhcp4 (eth1): state changed no lease
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[869]: <info>  [1772135544.7473] exiting (success)
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Stopped Network Manager.
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Starting Network Manager...
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8117] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:f3703019-5d3d-46b0-a4dd-d38ee7f05396)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8120] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8177] manager[0x55b70f8be000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Starting Hostname Service...
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Started Hostname Service.
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8790] hostname: hostname: using hostnamed
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8791] hostname: static hostname changed from (none) to "np0005631999.novalocal"
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8796] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8801] manager[0x55b70f8be000]: rfkill: Wi-Fi hardware radio set enabled
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8801] manager[0x55b70f8be000]: rfkill: WWAN hardware radio set enabled
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8828] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8829] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8829] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8830] manager: Networking is enabled by state file
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8833] settings: Loaded settings plugin: keyfile (internal)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8837] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8865] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8873] dhcp: init: Using DHCP client 'internal'
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8875] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8879] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8884] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8890] device (lo): Activation: starting connection 'lo' (dd0b54dd-8e74-4c4e-991a-59513cc199d2)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8896] device (eth0): carrier: link connected
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8899] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8903] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8904] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8909] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8914] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8920] device (eth1): carrier: link connected
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8922] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8927] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (6899c43b-9685-3591-bd57-0b0b5009002c) (indicated)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8928] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8932] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8938] device (eth1): Activation: starting connection 'Wired connection 1' (6899c43b-9685-3591-bd57-0b0b5009002c)
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Started Network Manager.
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8943] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8946] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8948] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8949] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8951] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8971] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8976] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8981] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.8999] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9006] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9009] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9015] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9017] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9031] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9032] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9035] device (lo): Activation: successful, device activated.
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9044] dhcp4 (eth0): state changed new lease, address=38.102.83.12
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9048] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 26 19:52:24 np0005631999.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9126] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9151] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9152] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9155] manager: NetworkManager state is now CONNECTED_SITE
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9158] device (eth0): Activation: successful, device activated.
Feb 26 19:52:24 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135544.9162] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 26 19:52:24 np0005631999.novalocal sudo[7668]: pam_unix(sudo:session): session closed for user root
Feb 26 19:52:25 np0005631999.novalocal python3[7755]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-bfef-2378-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 19:52:34 np0005631999.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 26 19:52:54 np0005631999.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.3867] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 26 19:53:10 np0005631999.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 26 19:53:10 np0005631999.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4237] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4240] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4246] device (eth1): Activation: successful, device activated.
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4253] manager: startup complete
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4254] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <warn>  [1772135590.4259] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4268] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4390] dhcp4 (eth1): canceled DHCP transaction
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4390] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4391] dhcp4 (eth1): state changed no lease
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4410] policy: auto-activating connection 'ci-private-network' (b726c5e7-e898-5b0a-8d8d-cdda95de2c7d)
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4418] device (eth1): Activation: starting connection 'ci-private-network' (b726c5e7-e898-5b0a-8d8d-cdda95de2c7d)
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4419] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4424] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4441] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4451] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4490] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4492] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 19:53:10 np0005631999.novalocal NetworkManager[7682]: <info>  [1772135590.4497] device (eth1): Activation: successful, device activated.
Feb 26 19:53:20 np0005631999.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 26 19:53:23 np0005631999.novalocal sudo[7859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkmorfspvucxyyvwhnnjghlhbkjqbltf ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 26 19:53:23 np0005631999.novalocal sudo[7859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:53:23 np0005631999.novalocal python3[7861]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 19:53:23 np0005631999.novalocal sudo[7859]: pam_unix(sudo:session): session closed for user root
Feb 26 19:53:23 np0005631999.novalocal sudo[7932]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmmttqmwznwkhkjwndepgslruyidimu ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 26 19:53:23 np0005631999.novalocal sudo[7932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 19:53:23 np0005631999.novalocal python3[7934]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772135602.9467223-259-56394143888289/source _original_basename=tmpbw6hryqu follow=False checksum=169b38e45fa720becc2788b00f42cff6d46ae64e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 19:53:23 np0005631999.novalocal sudo[7932]: pam_unix(sudo:session): session closed for user root
Feb 26 19:53:41 np0005631999.novalocal systemd[4795]: Starting Mark boot as successful...
Feb 26 19:53:41 np0005631999.novalocal systemd[4795]: Finished Mark boot as successful.
Feb 26 19:54:23 np0005631999.novalocal sshd-session[4804]: Received disconnect from 38.102.83.114 port 37216:11: disconnected by user
Feb 26 19:54:23 np0005631999.novalocal sshd-session[4804]: Disconnected from user zuul 38.102.83.114 port 37216
Feb 26 19:54:23 np0005631999.novalocal sshd-session[4791]: pam_unix(sshd:session): session closed for user zuul
Feb 26 19:54:23 np0005631999.novalocal systemd-logind[825]: Session 1 logged out. Waiting for processes to exit.
Feb 26 19:56:41 np0005631999.novalocal systemd[4795]: Created slice User Background Tasks Slice.
Feb 26 19:56:41 np0005631999.novalocal systemd[4795]: Starting Cleanup of User's Temporary Files and Directories...
Feb 26 19:56:41 np0005631999.novalocal systemd[4795]: Finished Cleanup of User's Temporary Files and Directories.
Feb 26 20:00:30 np0005631999.novalocal sshd-session[7965]: Accepted publickey for zuul from 38.102.83.114 port 37186 ssh2: RSA SHA256:BS1OYHxsj3pL7fTrw735iYV9x+N9OV3ZWOsuuIBKp/Q
Feb 26 20:00:30 np0005631999.novalocal systemd-logind[825]: New session 3 of user zuul.
Feb 26 20:00:30 np0005631999.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 26 20:00:30 np0005631999.novalocal sshd-session[7965]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:00:30 np0005631999.novalocal sudo[7992]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbjnxargcdkdnubynksoskxwwlxdcyjh ; /usr/bin/python3'
Feb 26 20:00:30 np0005631999.novalocal sudo[7992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:30 np0005631999.novalocal python3[7994]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-f366-f405-0000000021b2-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:00:30 np0005631999.novalocal sudo[7992]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:30 np0005631999.novalocal sudo[8021]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otinnqmbzeanxsekwezvymybzrqygxmw ; /usr/bin/python3'
Feb 26 20:00:30 np0005631999.novalocal sudo[8021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:30 np0005631999.novalocal python3[8023]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:00:31 np0005631999.novalocal sudo[8021]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:31 np0005631999.novalocal sudo[8047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvmaxzdowkguvothdpwwvsnfuuivjaap ; /usr/bin/python3'
Feb 26 20:00:31 np0005631999.novalocal sudo[8047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:31 np0005631999.novalocal python3[8049]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:00:31 np0005631999.novalocal sudo[8047]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:31 np0005631999.novalocal sudo[8073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-binhybbouearmusgwanvtmzxvalukhko ; /usr/bin/python3'
Feb 26 20:00:31 np0005631999.novalocal sudo[8073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:31 np0005631999.novalocal python3[8075]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:00:31 np0005631999.novalocal sudo[8073]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:31 np0005631999.novalocal sudo[8099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldkreyoxouymtwesmfrizvasfliqqbee ; /usr/bin/python3'
Feb 26 20:00:31 np0005631999.novalocal sudo[8099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:31 np0005631999.novalocal python3[8101]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:00:31 np0005631999.novalocal sudo[8099]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:32 np0005631999.novalocal sudo[8125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdzmthmkvqamjvknqgphmemjdkqzyehs ; /usr/bin/python3'
Feb 26 20:00:32 np0005631999.novalocal sudo[8125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:32 np0005631999.novalocal python3[8127]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:00:32 np0005631999.novalocal sudo[8125]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:32 np0005631999.novalocal sudo[8203]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrpvnibrzekggsqolvixmrvvfhqrwvtj ; /usr/bin/python3'
Feb 26 20:00:32 np0005631999.novalocal sudo[8203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:32 np0005631999.novalocal python3[8205]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:00:32 np0005631999.novalocal sudo[8203]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:32 np0005631999.novalocal sudo[8276]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcbhrkhbtnavbycgokdynbyjwtudcxbr ; /usr/bin/python3'
Feb 26 20:00:32 np0005631999.novalocal sudo[8276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:32 np0005631999.novalocal python3[8278]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772136032.4029171-514-273093188612614/source _original_basename=tmp2mzxf_uc follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:00:32 np0005631999.novalocal sudo[8276]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:34 np0005631999.novalocal sudo[8326]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxidhzevhfcyzjgfqlcsgwzyzehwbafj ; /usr/bin/python3'
Feb 26 20:00:34 np0005631999.novalocal sudo[8326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:35 np0005631999.novalocal python3[8328]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:00:35 np0005631999.novalocal systemd[1]: Reloading.
Feb 26 20:00:35 np0005631999.novalocal systemd-rc-local-generator[8351]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:00:35 np0005631999.novalocal sudo[8326]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:35 np0005631999.novalocal sudo[8389]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxqtssytcgcdgpzfkoshyswdecmjvzsa ; /usr/bin/python3'
Feb 26 20:00:35 np0005631999.novalocal sudo[8389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:35 np0005631999.novalocal python3[8391]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 26 20:00:35 np0005631999.novalocal sudo[8389]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:35 np0005631999.novalocal sudo[8415]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geyqycihkuvykckoanrhtjpsxwmdoujj ; /usr/bin/python3'
Feb 26 20:00:35 np0005631999.novalocal sudo[8415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:36 np0005631999.novalocal python3[8417]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:00:36 np0005631999.novalocal sudo[8415]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:36 np0005631999.novalocal sudo[8443]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-semozxlqlkukrigoezjibpiscxkhajew ; /usr/bin/python3'
Feb 26 20:00:36 np0005631999.novalocal sudo[8443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:36 np0005631999.novalocal python3[8445]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:00:36 np0005631999.novalocal sudo[8443]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:36 np0005631999.novalocal sudo[8471]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aorwwlumjxvqbkffiyhnecoebyysmgdj ; /usr/bin/python3'
Feb 26 20:00:36 np0005631999.novalocal sudo[8471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:36 np0005631999.novalocal python3[8473]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:00:36 np0005631999.novalocal sudo[8471]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:36 np0005631999.novalocal sudo[8499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxvsccrchypqbvwkjmfesktcoybmgka ; /usr/bin/python3'
Feb 26 20:00:36 np0005631999.novalocal sudo[8499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:36 np0005631999.novalocal python3[8501]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:00:36 np0005631999.novalocal sudo[8499]: pam_unix(sudo:session): session closed for user root
Feb 26 20:00:37 np0005631999.novalocal python3[8528]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-f366-f405-0000000021b9-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:00:37 np0005631999.novalocal python3[8558]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 26 20:00:39 np0005631999.novalocal sshd-session[7968]: Connection closed by 38.102.83.114 port 37186
Feb 26 20:00:39 np0005631999.novalocal sshd-session[7965]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:00:39 np0005631999.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 26 20:00:39 np0005631999.novalocal systemd[1]: session-3.scope: Consumed 3.935s CPU time.
Feb 26 20:00:39 np0005631999.novalocal systemd-logind[825]: Session 3 logged out. Waiting for processes to exit.
Feb 26 20:00:39 np0005631999.novalocal systemd-logind[825]: Removed session 3.
Feb 26 20:00:41 np0005631999.novalocal sshd-session[8564]: Accepted publickey for zuul from 38.102.83.114 port 56072 ssh2: RSA SHA256:BS1OYHxsj3pL7fTrw735iYV9x+N9OV3ZWOsuuIBKp/Q
Feb 26 20:00:41 np0005631999.novalocal systemd-logind[825]: New session 4 of user zuul.
Feb 26 20:00:41 np0005631999.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 26 20:00:41 np0005631999.novalocal sshd-session[8564]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:00:41 np0005631999.novalocal sudo[8591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzvgppavubdvizxpluebscrfmbgopjez ; /usr/bin/python3'
Feb 26 20:00:41 np0005631999.novalocal sudo[8591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:00:41 np0005631999.novalocal python3[8593]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 26 20:00:47 np0005631999.novalocal setsebool[8628]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 26 20:00:47 np0005631999.novalocal setsebool[8628]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:00:58 np0005631999.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:01:01 np0005631999.novalocal CROND[8656]: (root) CMD (run-parts /etc/cron.hourly)
Feb 26 20:01:01 np0005631999.novalocal run-parts[8659]: (/etc/cron.hourly) starting 0anacron
Feb 26 20:01:01 np0005631999.novalocal anacron[8667]: Anacron started on 2026-02-26
Feb 26 20:01:01 np0005631999.novalocal anacron[8667]: Will run job `cron.daily' in 37 min.
Feb 26 20:01:01 np0005631999.novalocal anacron[8667]: Will run job `cron.weekly' in 57 min.
Feb 26 20:01:01 np0005631999.novalocal anacron[8667]: Will run job `cron.monthly' in 77 min.
Feb 26 20:01:01 np0005631999.novalocal anacron[8667]: Jobs will be executed sequentially
Feb 26 20:01:01 np0005631999.novalocal run-parts[8669]: (/etc/cron.hourly) finished 0anacron
Feb 26 20:01:01 np0005631999.novalocal CROND[8655]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  Converting 389 SID table entries...
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:01:07 np0005631999.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:01:26 np0005631999.novalocal dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 26 20:01:26 np0005631999.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:01:26 np0005631999.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:01:26 np0005631999.novalocal systemd[1]: Reloading.
Feb 26 20:01:26 np0005631999.novalocal systemd-rc-local-generator[9438]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:01:26 np0005631999.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:01:27 np0005631999.novalocal sudo[8591]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:30 np0005631999.novalocal python3[13257]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-92a7-7bbb-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:01:31 np0005631999.novalocal kernel: evm: overlay not supported
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: Starting D-Bus User Message Bus...
Feb 26 20:01:31 np0005631999.novalocal dbus-broker-launch[14256]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 26 20:01:31 np0005631999.novalocal dbus-broker-launch[14256]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: Started D-Bus User Message Bus.
Feb 26 20:01:31 np0005631999.novalocal dbus-broker-lau[14256]: Ready
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: Created slice Slice /user.
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: podman-14096.scope: unit configures an IP firewall, but not running as root.
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: (This warning is only shown for the first unit using IP firewalling.)
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: Started podman-14096.scope.
Feb 26 20:01:31 np0005631999.novalocal systemd[4795]: Started podman-pause-f22b7bd5.scope.
Feb 26 20:01:32 np0005631999.novalocal sudo[14561]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bouzmzbpiphmfuvdebotvpqhhapmpxse ; /usr/bin/python3'
Feb 26 20:01:32 np0005631999.novalocal sudo[14561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:32 np0005631999.novalocal python3[14576]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.107:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.107:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:01:32 np0005631999.novalocal python3[14576]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 26 20:01:32 np0005631999.novalocal sudo[14561]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:32 np0005631999.novalocal sshd-session[8567]: Connection closed by 38.102.83.114 port 56072
Feb 26 20:01:32 np0005631999.novalocal sshd-session[8564]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:01:32 np0005631999.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 26 20:01:32 np0005631999.novalocal systemd[1]: session-4.scope: Consumed 42.335s CPU time.
Feb 26 20:01:32 np0005631999.novalocal systemd-logind[825]: Session 4 logged out. Waiting for processes to exit.
Feb 26 20:01:32 np0005631999.novalocal systemd-logind[825]: Removed session 4.
Feb 26 20:01:49 np0005631999.novalocal sshd-session[23303]: Unable to negotiate with 38.102.83.18 port 33100: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 26 20:01:49 np0005631999.novalocal sshd-session[23304]: Connection closed by 38.102.83.18 port 33072 [preauth]
Feb 26 20:01:49 np0005631999.novalocal sshd-session[23309]: Connection closed by 38.102.83.18 port 33084 [preauth]
Feb 26 20:01:49 np0005631999.novalocal sshd-session[23307]: Unable to negotiate with 38.102.83.18 port 33096: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 26 20:01:49 np0005631999.novalocal sshd-session[23310]: Unable to negotiate with 38.102.83.18 port 33114: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 26 20:01:53 np0005631999.novalocal sshd-session[25314]: Accepted publickey for zuul from 38.102.83.114 port 36282 ssh2: RSA SHA256:BS1OYHxsj3pL7fTrw735iYV9x+N9OV3ZWOsuuIBKp/Q
Feb 26 20:01:53 np0005631999.novalocal systemd-logind[825]: New session 5 of user zuul.
Feb 26 20:01:53 np0005631999.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 26 20:01:53 np0005631999.novalocal sshd-session[25314]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:01:53 np0005631999.novalocal python3[25440]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF8buLVnu+8VMwvKheGC3BFOZrc52pBzaIZ0FLkR7d+1oAiT9haaGB9EvuGxIKboLhqBLVLePWtbdYSepuUsOfY= zuul@np0005631998.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 20:01:53 np0005631999.novalocal sudo[25640]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtupdaecgknxkhpekoyemhddmmkxuevv ; /usr/bin/python3'
Feb 26 20:01:53 np0005631999.novalocal sudo[25640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:54 np0005631999.novalocal python3[25653]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF8buLVnu+8VMwvKheGC3BFOZrc52pBzaIZ0FLkR7d+1oAiT9haaGB9EvuGxIKboLhqBLVLePWtbdYSepuUsOfY= zuul@np0005631998.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 20:01:54 np0005631999.novalocal sudo[25640]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:54 np0005631999.novalocal sudo[26077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-motiarfjgjlfzexyyoxikmcezggelcdd ; /usr/bin/python3'
Feb 26 20:01:54 np0005631999.novalocal sudo[26077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:54 np0005631999.novalocal python3[26088]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005631999.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 26 20:01:54 np0005631999.novalocal useradd[26173]: new group: name=cloud-admin, GID=1002
Feb 26 20:01:54 np0005631999.novalocal useradd[26173]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 26 20:01:54 np0005631999.novalocal sudo[26077]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:55 np0005631999.novalocal sudo[26346]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlvuzvzsepwhjimxrloifbwecfoqxqfq ; /usr/bin/python3'
Feb 26 20:01:55 np0005631999.novalocal sudo[26346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:55 np0005631999.novalocal python3[26356]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF8buLVnu+8VMwvKheGC3BFOZrc52pBzaIZ0FLkR7d+1oAiT9haaGB9EvuGxIKboLhqBLVLePWtbdYSepuUsOfY= zuul@np0005631998.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 26 20:01:55 np0005631999.novalocal sudo[26346]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:55 np0005631999.novalocal sudo[26688]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ercfowtnnonnqghoinbkputjfeebxgfe ; /usr/bin/python3'
Feb 26 20:01:55 np0005631999.novalocal sudo[26688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:55 np0005631999.novalocal python3[26699]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:01:55 np0005631999.novalocal sudo[26688]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:55 np0005631999.novalocal sudo[27031]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjzardzntylrmgmvqkrkjlatzpzjvuze ; /usr/bin/python3'
Feb 26 20:01:55 np0005631999.novalocal sudo[27031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:56 np0005631999.novalocal python3[27042]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1772136115.416136-135-235872231511924/source _original_basename=tmpi0jjgdu0 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:01:56 np0005631999.novalocal sudo[27031]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:56 np0005631999.novalocal sudo[27446]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzeonyudxwwneqwcgzimhyvsnyqxlzwz ; /usr/bin/python3'
Feb 26 20:01:56 np0005631999.novalocal sudo[27446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:01:56 np0005631999.novalocal python3[27460]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 26 20:01:56 np0005631999.novalocal systemd[1]: Starting Hostname Service...
Feb 26 20:01:57 np0005631999.novalocal systemd[1]: Started Hostname Service.
Feb 26 20:01:57 np0005631999.novalocal systemd-hostnamed[27588]: Changed pretty hostname to 'compute-0'
Feb 26 20:01:57 compute-0 systemd-hostnamed[27588]: Hostname set to <compute-0> (static)
Feb 26 20:01:57 compute-0 NetworkManager[7682]: <info>  [1772136117.0876] hostname: static hostname changed from "np0005631999.novalocal" to "compute-0"
Feb 26 20:01:57 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 26 20:01:57 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 26 20:01:57 compute-0 sudo[27446]: pam_unix(sudo:session): session closed for user root
Feb 26 20:01:57 compute-0 sshd-session[25373]: Connection closed by 38.102.83.114 port 36282
Feb 26 20:01:57 compute-0 sshd-session[25314]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:01:57 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Feb 26 20:01:57 compute-0 systemd[1]: session-5.scope: Consumed 2.151s CPU time.
Feb 26 20:01:57 compute-0 systemd-logind[825]: Session 5 logged out. Waiting for processes to exit.
Feb 26 20:01:57 compute-0 systemd-logind[825]: Removed session 5.
Feb 26 20:02:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:02:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:02:03 compute-0 systemd[1]: man-db-cache-update.service: Consumed 43.507s CPU time.
Feb 26 20:02:03 compute-0 systemd[1]: run-r68951563a2a5429ba6dcd6ed8a74477f.service: Deactivated successfully.
Feb 26 20:02:07 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 26 20:02:27 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 26 20:05:41 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Feb 26 20:05:41 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 26 20:05:41 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Feb 26 20:05:41 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 26 20:06:13 compute-0 sshd-session[30545]: Accepted publickey for zuul from 38.102.83.18 port 41988 ssh2: RSA SHA256:BS1OYHxsj3pL7fTrw735iYV9x+N9OV3ZWOsuuIBKp/Q
Feb 26 20:06:13 compute-0 systemd-logind[825]: New session 6 of user zuul.
Feb 26 20:06:13 compute-0 systemd[1]: Started Session 6 of User zuul.
Feb 26 20:06:13 compute-0 sshd-session[30545]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:06:14 compute-0 python3[30621]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:06:15 compute-0 sudo[30735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qinljqtgoaurfkuexbmnshizqexnmyet ; /usr/bin/python3'
Feb 26 20:06:15 compute-0 sudo[30735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:15 compute-0 python3[30737]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:15 compute-0 sudo[30735]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:15 compute-0 sudo[30808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alwraoqgpmpaertpxtbuszeopxrtbuny ; /usr/bin/python3'
Feb 26 20:06:15 compute-0 sudo[30808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:15 compute-0 python3[30810]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=delorean.repo follow=False checksum=c7624fe5e858d4139de1ac159778eb6fd097c2ca backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:15 compute-0 sudo[30808]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:15 compute-0 sudo[30834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyfgocllzsnzlpvgzrikazeucacdhndm ; /usr/bin/python3'
Feb 26 20:06:15 compute-0 sudo[30834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:16 compute-0 python3[30836]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:16 compute-0 sudo[30834]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:16 compute-0 sudo[30907]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyqmzicletcamoiavnnaokffbuessekc ; /usr/bin/python3'
Feb 26 20:06:16 compute-0 sudo[30907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:16 compute-0 python3[30909]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:16 compute-0 sudo[30907]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:16 compute-0 sudo[30933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwmelglpfobziqscncdosszjthvfcdak ; /usr/bin/python3'
Feb 26 20:06:16 compute-0 sudo[30933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:16 compute-0 python3[30935]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:16 compute-0 sudo[30933]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:16 compute-0 sudo[31006]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqxchqvhjjwenibzxksxhdtkvzrxcjpd ; /usr/bin/python3'
Feb 26 20:06:16 compute-0 sudo[31006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:16 compute-0 python3[31008]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:16 compute-0 sudo[31006]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:16 compute-0 sudo[31032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntlvotbidkxdwshfswrmxsfamjwngxw ; /usr/bin/python3'
Feb 26 20:06:16 compute-0 sudo[31032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:17 compute-0 python3[31034]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:17 compute-0 sudo[31032]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:17 compute-0 sudo[31105]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqqumkpcsrtnejxbdaaruikgdddpxno ; /usr/bin/python3'
Feb 26 20:06:17 compute-0 sudo[31105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:17 compute-0 python3[31107]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:17 compute-0 sudo[31105]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:17 compute-0 sudo[31131]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlnsyamckebroaugnskzcryvpriavukg ; /usr/bin/python3'
Feb 26 20:06:17 compute-0 sudo[31131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:17 compute-0 python3[31133]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:17 compute-0 sudo[31131]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:17 compute-0 sudo[31204]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wixyxebodrwomyuwytirybokgpbniojc ; /usr/bin/python3'
Feb 26 20:06:17 compute-0 sudo[31204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:17 compute-0 python3[31206]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:17 compute-0 sudo[31204]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:17 compute-0 sudo[31230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtxyvsqxatixzrrieqchoukuoztgmvez ; /usr/bin/python3'
Feb 26 20:06:17 compute-0 sudo[31230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:18 compute-0 python3[31232]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:18 compute-0 sudo[31230]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:18 compute-0 sudo[31303]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gohxmgplovgfbkdkyilhsfeqjexnxbzt ; /usr/bin/python3'
Feb 26 20:06:18 compute-0 sudo[31303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:18 compute-0 python3[31305]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:18 compute-0 sudo[31303]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:18 compute-0 sudo[31329]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfsisrerkmoaxkzftzltjrlizifmahur ; /usr/bin/python3'
Feb 26 20:06:18 compute-0 sudo[31329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:18 compute-0 python3[31331]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 26 20:06:18 compute-0 sudo[31329]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:18 compute-0 sudo[31402]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwtkkwdqotfbozjauxlvdkdrqgndmuue ; /usr/bin/python3'
Feb 26 20:06:18 compute-0 sudo[31402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:06:18 compute-0 python3[31404]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772136375.2514215-34283-210434763293669/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=06a0a916cb7cbc51b08d6616a672f1322305cccf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:06:18 compute-0 sudo[31402]: pam_unix(sudo:session): session closed for user root
Feb 26 20:06:20 compute-0 sshd-session[31430]: Connection closed by 192.168.122.11 port 40472 [preauth]
Feb 26 20:06:20 compute-0 sshd-session[31429]: Connection closed by 192.168.122.11 port 40462 [preauth]
Feb 26 20:06:20 compute-0 sshd-session[31432]: Unable to negotiate with 192.168.122.11 port 40486: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 26 20:06:20 compute-0 sshd-session[31433]: Unable to negotiate with 192.168.122.11 port 40498: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 26 20:06:20 compute-0 sshd-session[31431]: Unable to negotiate with 192.168.122.11 port 40482: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 26 20:07:25 compute-0 python3[31463]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:12:24 compute-0 sshd-session[30548]: Received disconnect from 38.102.83.18 port 41988:11: disconnected by user
Feb 26 20:12:24 compute-0 sshd-session[30548]: Disconnected from user zuul 38.102.83.18 port 41988
Feb 26 20:12:24 compute-0 sshd-session[30545]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:12:24 compute-0 systemd-logind[825]: Session 6 logged out. Waiting for processes to exit.
Feb 26 20:12:24 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 26 20:12:24 compute-0 systemd[1]: session-6.scope: Consumed 4.294s CPU time.
Feb 26 20:12:24 compute-0 systemd-logind[825]: Removed session 6.
Feb 26 20:18:47 compute-0 sshd-session[31469]: Accepted publickey for zuul from 192.168.122.30 port 34792 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:18:47 compute-0 systemd-logind[825]: New session 7 of user zuul.
Feb 26 20:18:47 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 26 20:18:47 compute-0 sshd-session[31469]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:18:48 compute-0 python3.9[31622]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:18:49 compute-0 sudo[31801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnzzkqcyzdzlzejrxyeibqmgqewecepm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137128.8052802-27-13612788047803/AnsiballZ_command.py'
Feb 26 20:18:49 compute-0 sudo[31801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:18:49 compute-0 python3.9[31804]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:18:56 compute-0 sudo[31801]: pam_unix(sudo:session): session closed for user root
Feb 26 20:18:56 compute-0 sshd-session[31472]: Connection closed by 192.168.122.30 port 34792
Feb 26 20:18:56 compute-0 sshd-session[31469]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:18:56 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 26 20:18:56 compute-0 systemd[1]: session-7.scope: Consumed 7.491s CPU time.
Feb 26 20:18:56 compute-0 systemd-logind[825]: Session 7 logged out. Waiting for processes to exit.
Feb 26 20:18:56 compute-0 systemd-logind[825]: Removed session 7.
Feb 26 20:19:02 compute-0 sshd-session[31862]: Accepted publickey for zuul from 192.168.122.30 port 49388 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:19:02 compute-0 systemd-logind[825]: New session 8 of user zuul.
Feb 26 20:19:02 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 26 20:19:02 compute-0 sshd-session[31862]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:19:03 compute-0 python3.9[32015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:19:03 compute-0 sshd-session[31865]: Connection closed by 192.168.122.30 port 49388
Feb 26 20:19:03 compute-0 sshd-session[31862]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:19:03 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 26 20:19:03 compute-0 systemd-logind[825]: Session 8 logged out. Waiting for processes to exit.
Feb 26 20:19:03 compute-0 systemd-logind[825]: Removed session 8.
Feb 26 20:19:19 compute-0 sshd-session[32043]: Accepted publickey for zuul from 192.168.122.30 port 52774 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:19:19 compute-0 systemd-logind[825]: New session 9 of user zuul.
Feb 26 20:19:19 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 26 20:19:19 compute-0 sshd-session[32043]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:19:20 compute-0 python3.9[32196]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 26 20:19:21 compute-0 python3.9[32370]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:19:21 compute-0 sudo[32520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkawspmclrqxylgjpvraxexwtljqshaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137161.4160109-40-122391625758516/AnsiballZ_command.py'
Feb 26 20:19:21 compute-0 sudo[32520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:22 compute-0 python3.9[32523]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:19:22 compute-0 sudo[32520]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:22 compute-0 sudo[32674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwengkmddudomhbrrfzereosaxsyvwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137162.4972715-52-65566304285076/AnsiballZ_stat.py'
Feb 26 20:19:22 compute-0 sudo[32674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:23 compute-0 python3.9[32677]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:19:23 compute-0 sudo[32674]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:23 compute-0 sudo[32827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hagznzhtowjgdukjwswoxshspndgqufv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137163.2155201-60-103213609487591/AnsiballZ_file.py'
Feb 26 20:19:23 compute-0 sudo[32827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:23 compute-0 python3.9[32830]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:19:23 compute-0 sudo[32827]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:24 compute-0 sudo[32980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuhujxawnpwnyzpfuaekhylraegplsoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137163.986298-68-128305562053871/AnsiballZ_stat.py'
Feb 26 20:19:24 compute-0 sudo[32980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:24 compute-0 python3.9[32983]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:19:24 compute-0 sudo[32980]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:24 compute-0 sudo[33104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cliglrqkhpxjbjjdlpirstlasazeckrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137163.986298-68-128305562053871/AnsiballZ_copy.py'
Feb 26 20:19:24 compute-0 sudo[33104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:25 compute-0 python3.9[33107]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137163.986298-68-128305562053871/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:19:25 compute-0 sudo[33104]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:25 compute-0 sudo[33257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyokyeoqllgewbfortyxbpyjlqsjkdrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137165.2884452-83-27329983965933/AnsiballZ_setup.py'
Feb 26 20:19:25 compute-0 sudo[33257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:25 compute-0 python3.9[33260]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:19:26 compute-0 sudo[33257]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:26 compute-0 sudo[33414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuixgrntbycobduanvxqcledgdrfclae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137166.3759336-91-121828443837078/AnsiballZ_file.py'
Feb 26 20:19:26 compute-0 sudo[33414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:26 compute-0 python3.9[33417]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:19:26 compute-0 sudo[33414]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:27 compute-0 sudo[33567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vptytpwitilbrropvmolvmmoiidrurov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137167.0660968-100-56564864234888/AnsiballZ_file.py'
Feb 26 20:19:27 compute-0 sudo[33567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:27 compute-0 python3.9[33570]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:19:27 compute-0 sudo[33567]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:28 compute-0 python3.9[33720]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:19:29 compute-0 irqbalance[809]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 26 20:19:29 compute-0 irqbalance[809]: IRQ 27 affinity is now unmanaged
Feb 26 20:19:31 compute-0 python3.9[33974]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:19:31 compute-0 python3.9[34124]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:19:32 compute-0 python3.9[34278]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:19:33 compute-0 sudo[34434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnoazqgcdwowjwyevljzguyfrkcwibhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137173.294241-148-81203343059899/AnsiballZ_setup.py'
Feb 26 20:19:33 compute-0 sudo[34434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:33 compute-0 python3.9[34437]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:19:34 compute-0 sudo[34434]: pam_unix(sudo:session): session closed for user root
Feb 26 20:19:34 compute-0 sudo[34519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmapcfovbhycuqzapaqnmfmozxryparp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137173.294241-148-81203343059899/AnsiballZ_dnf.py'
Feb 26 20:19:34 compute-0 sudo[34519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:19:35 compute-0 python3.9[34522]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:20:17 compute-0 systemd[1]: Reloading.
Feb 26 20:20:18 compute-0 systemd-rc-local-generator[34718]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:20:18 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 26 20:20:18 compute-0 systemd[1]: Reloading.
Feb 26 20:20:18 compute-0 systemd-rc-local-generator[34766]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:20:18 compute-0 systemd[1]: Starting dnf makecache...
Feb 26 20:20:18 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 26 20:20:18 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 26 20:20:18 compute-0 systemd[1]: Reloading.
Feb 26 20:20:18 compute-0 systemd-rc-local-generator[34814]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:20:18 compute-0 dnf[34788]: Failed determining last makecache time.
Feb 26 20:20:18 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-barbican-42b4c41831408a8e323 137 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-glean-642fffe0203a8ffcc2443db52 158 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-cinder-e95a374f4f00ef02d562d 139 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-stevedore-c4acc5639fd2329372142 153 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:20:19 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-cloudkitty-tests-tempest-ef9563 141 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-diskimage-builder-cbb4478c143869181ba9 115 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-nova-5cfeecbf22fca58822607dd 158 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-designate-tests-tempest-347fdbc 163 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-glance-1fd12c29b339f30fe823e 165 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 162 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-manila-8fa2b5793100022b4d0f6 140 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-whitebox-neutron-tests-tempest- 136 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-octavia-76dfc1e35cf7f4dd6102 134 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-watcher-c014f81a8647287f6dcc 160 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-tcib-b403f1051724db0286e1418f59 150 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 164 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-swift-dc98a8463506ac520c469a 165 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-python-tempestconf-8e33668cda707818ee1 174 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: delorean-openstack-heat-ui-013accbfd179753bc3f0 198 kB/s | 3.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: CentOS Stream 9 - BaseOS                         29 kB/s | 7.0 kB     00:00
Feb 26 20:20:19 compute-0 dnf[34788]: CentOS Stream 9 - AppStream                      69 kB/s | 7.1 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: CentOS Stream 9 - CRB                            70 kB/s | 6.9 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: CentOS Stream 9 - Extras packages                72 kB/s | 7.6 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: dlrn-antelope-testing                           149 kB/s | 3.0 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: dlrn-antelope-build-deps                        143 kB/s | 3.0 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: centos9-rabbitmq                                112 kB/s | 3.0 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: centos9-storage                                 112 kB/s | 3.0 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: centos9-opstools                                138 kB/s | 3.0 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: NFV SIG OpenvSwitch                             124 kB/s | 3.0 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: repo-setup-centos-appstream                     200 kB/s | 4.4 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: repo-setup-centos-baseos                        149 kB/s | 3.9 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: repo-setup-centos-highavailability              162 kB/s | 3.9 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: repo-setup-centos-powertools                    188 kB/s | 4.3 kB     00:00
Feb 26 20:20:20 compute-0 dnf[34788]: Extra Packages for Enterprise Linux 9 - x86_64  168 kB/s |  30 kB     00:00
Feb 26 20:20:21 compute-0 dnf[34788]: Metadata cache created.
Feb 26 20:20:21 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 26 20:20:21 compute-0 systemd[1]: Finished dnf makecache.
Feb 26 20:20:21 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.753s CPU time.
Feb 26 20:21:16 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:21:16 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:21:16 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 26 20:21:16 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:21:16 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:21:16 compute-0 systemd[1]: Reloading.
Feb 26 20:21:16 compute-0 systemd-rc-local-generator[35190]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:21:16 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:21:17 compute-0 sudo[34519]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:17 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:21:17 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:21:17 compute-0 systemd[1]: run-r7090712d3c3f43678f6b1f168f0d4016.service: Deactivated successfully.
Feb 26 20:21:17 compute-0 sudo[36118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruogtccjbdcxqzjvhhnmngvpbpvhuros ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137277.501856-160-49171811175995/AnsiballZ_command.py'
Feb 26 20:21:17 compute-0 sudo[36118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:17 compute-0 python3.9[36121]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:18 compute-0 sudo[36118]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:19 compute-0 sudo[36400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgiozboandomyyyfqjjvsreobthwyhmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137279.0756629-168-241988015372247/AnsiballZ_selinux.py'
Feb 26 20:21:19 compute-0 sudo[36400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:19 compute-0 python3.9[36403]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 26 20:21:19 compute-0 sudo[36400]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:20 compute-0 sudo[36553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnohfnapyutlvepcsauoffyvhbwrjvhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137280.278294-179-164137645406340/AnsiballZ_command.py'
Feb 26 20:21:20 compute-0 sudo[36553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:20 compute-0 python3.9[36556]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 26 20:21:21 compute-0 sudo[36553]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:21 compute-0 sudo[36707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvhzozxprvdbsjdtfxguaauyaobmqth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137281.5333521-187-23364399501696/AnsiballZ_file.py'
Feb 26 20:21:21 compute-0 sudo[36707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:22 compute-0 python3.9[36710]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:21:22 compute-0 sudo[36707]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:23 compute-0 sudo[36860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiqeidvqpxeoxdcabauxwxsgikpjttvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137282.961186-195-145787944388186/AnsiballZ_mount.py'
Feb 26 20:21:23 compute-0 sudo[36860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:23 compute-0 python3.9[36863]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 26 20:21:23 compute-0 sudo[36860]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:24 compute-0 sudo[37013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdcduewmpdtuwrcntnxufrbxguwkzcwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137284.366285-223-240697025749677/AnsiballZ_file.py'
Feb 26 20:21:24 compute-0 sudo[37013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:24 compute-0 python3.9[37016]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:21:24 compute-0 sudo[37013]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:25 compute-0 sudo[37166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjafjeeumwobfshbblpgzymonvqpfttk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137284.965059-231-232499486431534/AnsiballZ_stat.py'
Feb 26 20:21:25 compute-0 sudo[37166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:25 compute-0 python3.9[37169]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:21:25 compute-0 sudo[37166]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:25 compute-0 sudo[37290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xinrddsxtorcrmbbdflxjnytnzzzejnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137284.965059-231-232499486431534/AnsiballZ_copy.py'
Feb 26 20:21:25 compute-0 sudo[37290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:25 compute-0 python3.9[37293]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137284.965059-231-232499486431534/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:21:25 compute-0 sudo[37290]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:26 compute-0 sudo[37443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcpblutlyvxymynbevocqmfgtmcfrhog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137286.3513064-255-4887974289554/AnsiballZ_stat.py'
Feb 26 20:21:26 compute-0 sudo[37443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:29 compute-0 python3.9[37446]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:21:29 compute-0 sudo[37443]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:30 compute-0 sudo[37596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylewtmfeyzygfjawnjbekjhmdhdixesm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137289.970411-263-254153718330764/AnsiballZ_command.py'
Feb 26 20:21:30 compute-0 sudo[37596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:30 compute-0 python3.9[37599]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:30 compute-0 sudo[37596]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:30 compute-0 sudo[37750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uubrpsawbtfcddgexdbolqhhcgnfavcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137290.648852-271-98764388394004/AnsiballZ_file.py'
Feb 26 20:21:30 compute-0 sudo[37750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:31 compute-0 python3.9[37753]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:21:31 compute-0 sudo[37750]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:31 compute-0 sudo[37903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebtpkyerrcpwmjxjtjsuzkcsoryofcpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137291.4871228-282-31719662473576/AnsiballZ_getent.py'
Feb 26 20:21:31 compute-0 sudo[37903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:32 compute-0 python3.9[37906]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 26 20:21:32 compute-0 sudo[37903]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:32 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:21:32 compute-0 sudo[38058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fevxmjskonmbxqwekwbbzwjlsvomyhrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137292.3359187-290-192747780939964/AnsiballZ_group.py'
Feb 26 20:21:32 compute-0 sudo[38058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:32 compute-0 python3.9[38061]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 26 20:21:32 compute-0 groupadd[38062]: group added to /etc/group: name=qemu, GID=107
Feb 26 20:21:32 compute-0 groupadd[38062]: group added to /etc/gshadow: name=qemu
Feb 26 20:21:32 compute-0 groupadd[38062]: new group: name=qemu, GID=107
Feb 26 20:21:32 compute-0 sudo[38058]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:33 compute-0 sudo[38217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bclfcxataohjanekyfyliiqfaathrfdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137293.1097403-298-51393503942109/AnsiballZ_user.py'
Feb 26 20:21:33 compute-0 sudo[38217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:33 compute-0 python3.9[38220]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 26 20:21:33 compute-0 useradd[38222]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/1
Feb 26 20:21:33 compute-0 sudo[38217]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:34 compute-0 sudo[38378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaznsgudgmdperndnvtiaulekglkblwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137294.0450535-306-97921451782155/AnsiballZ_getent.py'
Feb 26 20:21:34 compute-0 sudo[38378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:34 compute-0 python3.9[38381]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 26 20:21:34 compute-0 sudo[38378]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:35 compute-0 sudo[38532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcqigjrguliflvzfcsmccyvonzwyfzwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137294.7561142-314-122899945078247/AnsiballZ_group.py'
Feb 26 20:21:35 compute-0 sudo[38532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:35 compute-0 python3.9[38535]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 26 20:21:35 compute-0 groupadd[38536]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 26 20:21:35 compute-0 groupadd[38536]: group added to /etc/gshadow: name=hugetlbfs
Feb 26 20:21:35 compute-0 groupadd[38536]: new group: name=hugetlbfs, GID=42477
Feb 26 20:21:35 compute-0 sudo[38532]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:35 compute-0 sudo[38691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oijtavxcdahvhsgrvnkrpwsfakxnbetn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137295.4718792-323-205984866439569/AnsiballZ_file.py'
Feb 26 20:21:35 compute-0 sudo[38691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:35 compute-0 python3.9[38694]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 26 20:21:35 compute-0 sudo[38691]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:36 compute-0 sudo[38844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxhxhuoulmxhakgsahdxkcrfecmcuyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137296.2919672-334-254617874379501/AnsiballZ_dnf.py'
Feb 26 20:21:36 compute-0 sudo[38844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:36 compute-0 python3.9[38847]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:21:38 compute-0 sudo[38844]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:38 compute-0 sudo[38998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvvuybyktpylscahzvcvvvtgflcexjxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137298.4377117-342-104283398803638/AnsiballZ_file.py'
Feb 26 20:21:38 compute-0 sudo[38998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:38 compute-0 python3.9[39001]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:21:38 compute-0 sudo[38998]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:39 compute-0 sudo[39151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jirdhpsccttvoriwcyghyihjffxptbva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137299.0546033-350-167285431297738/AnsiballZ_stat.py'
Feb 26 20:21:39 compute-0 sudo[39151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:39 compute-0 python3.9[39154]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:21:39 compute-0 sudo[39151]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:39 compute-0 sudo[39275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlmbqoyrhxiinylwxomiyuzinsomemgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137299.0546033-350-167285431297738/AnsiballZ_copy.py'
Feb 26 20:21:39 compute-0 sudo[39275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:39 compute-0 python3.9[39278]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137299.0546033-350-167285431297738/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:21:39 compute-0 sudo[39275]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:40 compute-0 sudo[39428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qitpcqgtjqdwrqrtzdefpouztzcuabne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137300.1375222-365-136660087258912/AnsiballZ_systemd.py'
Feb 26 20:21:40 compute-0 sudo[39428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:41 compute-0 python3.9[39431]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:21:41 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 26 20:21:41 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 26 20:21:41 compute-0 kernel: Bridge firewalling registered
Feb 26 20:21:41 compute-0 systemd-modules-load[39435]: Inserted module 'br_netfilter'
Feb 26 20:21:41 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 26 20:21:41 compute-0 sudo[39428]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:41 compute-0 sudo[39588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdpovnwgydpfdnbvafxbtmrpdtdbzuwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137301.3149428-373-3604711590910/AnsiballZ_stat.py'
Feb 26 20:21:41 compute-0 sudo[39588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:41 compute-0 python3.9[39591]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:21:41 compute-0 sudo[39588]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:42 compute-0 sudo[39712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipnzxdcnmyjuhznfumvgmchyleonycca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137301.3149428-373-3604711590910/AnsiballZ_copy.py'
Feb 26 20:21:42 compute-0 sudo[39712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:42 compute-0 python3.9[39715]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137301.3149428-373-3604711590910/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:21:42 compute-0 sudo[39712]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:42 compute-0 sudo[39865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hukuuuuslhcochelvjfanhgqdfzexwbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137302.6168413-391-235628386541657/AnsiballZ_dnf.py'
Feb 26 20:21:42 compute-0 sudo[39865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:43 compute-0 python3.9[39868]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:21:45 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:21:46 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:21:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:21:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:21:46 compute-0 systemd[1]: Reloading.
Feb 26 20:21:46 compute-0 systemd-rc-local-generator[39927]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:21:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:21:46 compute-0 sudo[39865]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:47 compute-0 python3.9[41414]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:21:48 compute-0 python3.9[42578]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 26 20:21:48 compute-0 python3.9[43540]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:21:49 compute-0 sudo[44127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imptmgpasnbordqwjxqflbpfklflkxix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137309.1700988-430-263295648373070/AnsiballZ_command.py'
Feb 26 20:21:49 compute-0 sudo[44127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:21:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:21:49 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.521s CPU time.
Feb 26 20:21:49 compute-0 systemd[1]: run-r1ab6b81ce5684d0781be5ea760d3ad51.service: Deactivated successfully.
Feb 26 20:21:49 compute-0 python3.9[44131]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:49 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 26 20:21:50 compute-0 systemd[1]: Starting Authorization Manager...
Feb 26 20:21:50 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 26 20:21:50 compute-0 polkitd[44348]: Started polkitd version 0.117
Feb 26 20:21:50 compute-0 polkitd[44348]: Loading rules from directory /etc/polkit-1/rules.d
Feb 26 20:21:50 compute-0 polkitd[44348]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 26 20:21:50 compute-0 polkitd[44348]: Finished loading, compiling and executing 2 rules
Feb 26 20:21:50 compute-0 systemd[1]: Started Authorization Manager.
Feb 26 20:21:50 compute-0 polkitd[44348]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 26 20:21:50 compute-0 sudo[44127]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:50 compute-0 sudo[44516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eayqrrbczbckegisbcvuskrnbwavtoqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137310.5645437-439-30717393884374/AnsiballZ_systemd.py'
Feb 26 20:21:50 compute-0 sudo[44516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:51 compute-0 python3.9[44519]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:21:51 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 26 20:21:51 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 26 20:21:51 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 26 20:21:51 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 26 20:21:51 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 26 20:21:51 compute-0 sudo[44516]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:52 compute-0 python3.9[44680]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 26 20:21:53 compute-0 sudo[44830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahzukfuuyokvieqoifieasbyqyofisql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137313.4902208-496-155123761555929/AnsiballZ_systemd.py'
Feb 26 20:21:53 compute-0 sudo[44830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:54 compute-0 python3.9[44833]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:21:54 compute-0 systemd[1]: Reloading.
Feb 26 20:21:54 compute-0 systemd-rc-local-generator[44862]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:21:54 compute-0 sudo[44830]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:54 compute-0 sudo[45027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqwiqvhxzczjocqmdqxhjygvaypoaful ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137314.557972-496-99219374336975/AnsiballZ_systemd.py'
Feb 26 20:21:54 compute-0 sudo[45027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:55 compute-0 python3.9[45030]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:21:55 compute-0 systemd[1]: Reloading.
Feb 26 20:21:55 compute-0 systemd-rc-local-generator[45057]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:21:55 compute-0 sudo[45027]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:55 compute-0 sudo[45223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcjncjuacjtntagocwkfkuynnlnenks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137315.5524135-512-233151857168828/AnsiballZ_command.py'
Feb 26 20:21:55 compute-0 sudo[45223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:56 compute-0 python3.9[45226]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:56 compute-0 sudo[45223]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:56 compute-0 sudo[45377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmocysrumuoxlopwtxoxioteluobabj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137316.1942544-520-154119072998629/AnsiballZ_command.py'
Feb 26 20:21:56 compute-0 sudo[45377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:56 compute-0 python3.9[45380]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:56 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 26 20:21:56 compute-0 sudo[45377]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:57 compute-0 sudo[45531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eddmqlpeyngtdfiwfrbispfuyavefdrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137316.8755932-528-54186554819881/AnsiballZ_command.py'
Feb 26 20:21:57 compute-0 sudo[45531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:57 compute-0 python3.9[45534]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:58 compute-0 sudo[45531]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:59 compute-0 sudo[45694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjnonvfmhbrmekroozmrtkztyulswawi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137318.8434076-536-242159018739713/AnsiballZ_command.py'
Feb 26 20:21:59 compute-0 sudo[45694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:59 compute-0 python3.9[45697]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:21:59 compute-0 sudo[45694]: pam_unix(sudo:session): session closed for user root
Feb 26 20:21:59 compute-0 sudo[45848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibsparngbwxmmlgwcurkkyltjkvldxrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137319.392048-544-23331602836607/AnsiballZ_systemd.py'
Feb 26 20:21:59 compute-0 sudo[45848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:21:59 compute-0 python3.9[45851]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:21:59 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 26 20:21:59 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 26 20:21:59 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 26 20:21:59 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 26 20:21:59 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 26 20:21:59 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 26 20:22:00 compute-0 sudo[45848]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:00 compute-0 sshd-session[32046]: Connection closed by 192.168.122.30 port 52774
Feb 26 20:22:00 compute-0 sshd-session[32043]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:22:00 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 26 20:22:00 compute-0 systemd[1]: session-9.scope: Consumed 2min 4.678s CPU time.
Feb 26 20:22:00 compute-0 systemd-logind[825]: Session 9 logged out. Waiting for processes to exit.
Feb 26 20:22:00 compute-0 systemd-logind[825]: Removed session 9.
Feb 26 20:22:06 compute-0 sshd-session[45881]: Accepted publickey for zuul from 192.168.122.30 port 50592 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:22:06 compute-0 systemd-logind[825]: New session 10 of user zuul.
Feb 26 20:22:06 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 26 20:22:06 compute-0 sshd-session[45881]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:22:07 compute-0 python3.9[46034]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:22:08 compute-0 python3.9[46188]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:22:09 compute-0 sudo[46342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtrcrhlddyghjjkzfbghmuwyapsgcdxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137329.008909-45-259107902094923/AnsiballZ_command.py'
Feb 26 20:22:09 compute-0 sudo[46342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:09 compute-0 python3.9[46345]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:22:09 compute-0 sudo[46342]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:10 compute-0 python3.9[46496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:22:11 compute-0 sudo[46650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsgfklintilgylcuwjcajvmmqicsdaga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137330.911956-65-274682828464323/AnsiballZ_setup.py'
Feb 26 20:22:11 compute-0 sudo[46650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:11 compute-0 python3.9[46653]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:22:11 compute-0 sudo[46650]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:12 compute-0 sudo[46735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdkntnanjfhtrnplijrdqfziowwrmwyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137330.911956-65-274682828464323/AnsiballZ_dnf.py'
Feb 26 20:22:12 compute-0 sudo[46735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:12 compute-0 python3.9[46738]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:22:13 compute-0 sudo[46735]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:14 compute-0 sudo[46889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imgvrfykkciwtclwennkbtajjfooiinv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137333.7780657-77-82900939796611/AnsiballZ_setup.py'
Feb 26 20:22:14 compute-0 sudo[46889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:14 compute-0 python3.9[46892]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:22:14 compute-0 sudo[46889]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:15 compute-0 sudo[47061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdustvqoukmbmkxovgfqurrvknflofoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137334.6705225-88-6932002609432/AnsiballZ_file.py'
Feb 26 20:22:15 compute-0 sudo[47061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:15 compute-0 python3.9[47064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:22:15 compute-0 sudo[47061]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:15 compute-0 sudo[47214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkoxlmeqcnwxdfjwycsxaggrtvynvfgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137335.4492033-96-18617922327934/AnsiballZ_command.py'
Feb 26 20:22:15 compute-0 sudo[47214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:15 compute-0 python3.9[47217]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:22:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2451975634-merged.mount: Deactivated successfully.
Feb 26 20:22:15 compute-0 podman[47218]: 2026-02-26 20:22:15.925196499 +0000 UTC m=+0.069638502 system refresh
Feb 26 20:22:15 compute-0 sudo[47214]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:16 compute-0 sudo[47381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-butnlooldygluydwkclloljdfxjvorjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137336.1477685-104-228342787193660/AnsiballZ_stat.py'
Feb 26 20:22:16 compute-0 sudo[47381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:16 compute-0 python3.9[47384]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:22:16 compute-0 sudo[47381]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:22:17 compute-0 sudo[47505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncgobwrnokmwrecgmrqffiyujpxyhvrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137336.1477685-104-228342787193660/AnsiballZ_copy.py'
Feb 26 20:22:17 compute-0 sudo[47505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:17 compute-0 python3.9[47508]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137336.1477685-104-228342787193660/.source.json follow=False _original_basename=podman_network_config.j2 checksum=8b51f8cd8d4bab055167514554377c071aa8d25e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:22:17 compute-0 sudo[47505]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:17 compute-0 sudo[47658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecwtthklnucchbjxeouarmygxdafvesg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137337.5421598-119-12130095855328/AnsiballZ_stat.py'
Feb 26 20:22:17 compute-0 sudo[47658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:18 compute-0 python3.9[47661]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:22:18 compute-0 sudo[47658]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:18 compute-0 sudo[47782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eljpkicuwnazradyfwmqfhwddsxnlwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137337.5421598-119-12130095855328/AnsiballZ_copy.py'
Feb 26 20:22:18 compute-0 sudo[47782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:18 compute-0 python3.9[47785]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137337.5421598-119-12130095855328/.source.conf follow=False _original_basename=registries.conf.j2 checksum=bd8960d09011f95ec8946d00609d580926fa47cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:22:18 compute-0 sudo[47782]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:19 compute-0 sudo[47935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbyymeeeehlrqfuraxbqgagcizimuobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137338.6933188-135-55906887856526/AnsiballZ_ini_file.py'
Feb 26 20:22:19 compute-0 sudo[47935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:19 compute-0 python3.9[47938]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:22:19 compute-0 sudo[47935]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:19 compute-0 sudo[48088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqnmujypddpyeypmaqclbqfxvyszrcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137339.476879-135-1272789194343/AnsiballZ_ini_file.py'
Feb 26 20:22:19 compute-0 sudo[48088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:19 compute-0 python3.9[48091]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:22:19 compute-0 sudo[48088]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:20 compute-0 sudo[48241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxqftoddzruxldkozqhvbfykgvfgbraz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137340.0955787-135-55550464338223/AnsiballZ_ini_file.py'
Feb 26 20:22:20 compute-0 sudo[48241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:20 compute-0 python3.9[48244]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:22:20 compute-0 sudo[48241]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:20 compute-0 sudo[48394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scegbjptbfwlpswrojucqlwedyanukpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137340.6982925-135-187112143487362/AnsiballZ_ini_file.py'
Feb 26 20:22:20 compute-0 sudo[48394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:21 compute-0 python3.9[48397]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:22:21 compute-0 sudo[48394]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:22 compute-0 python3.9[48547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:22:22 compute-0 sudo[48699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lihakqtxnrcgjgxsqcecggzhdjjmtzlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137342.2780325-175-86735120831300/AnsiballZ_dnf.py'
Feb 26 20:22:22 compute-0 sudo[48699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:22 compute-0 python3.9[48702]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:23 compute-0 sudo[48699]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:24 compute-0 sudo[48853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lewqtmidbrqsrngahbheizgadcepzynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137344.098552-183-87840782627373/AnsiballZ_dnf.py'
Feb 26 20:22:24 compute-0 sudo[48853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:24 compute-0 python3.9[48856]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:26 compute-0 sudo[48853]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:26 compute-0 sudo[49015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kasepljaofljvtuypxyjwiwvqepyukbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137346.6131907-193-62209963726446/AnsiballZ_dnf.py'
Feb 26 20:22:26 compute-0 sudo[49015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:27 compute-0 python3.9[49018]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:28 compute-0 sudo[49015]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:28 compute-0 sudo[49169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxfaffhobgilshancxacqublamuelqjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137348.5525026-202-218798940231102/AnsiballZ_dnf.py'
Feb 26 20:22:28 compute-0 sudo[49169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:29 compute-0 python3.9[49172]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:30 compute-0 sudo[49169]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:31 compute-0 sudo[49323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxfoyydzpuznxhqgxujslswumrtvnpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137350.6566534-213-65049950391804/AnsiballZ_dnf.py'
Feb 26 20:22:31 compute-0 sudo[49323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:31 compute-0 python3.9[49326]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:34 compute-0 sudo[49323]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:34 compute-0 sudo[49481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipmkdhmytbedmgxteknkwmpyuogjzasz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137354.3243139-221-88033200502764/AnsiballZ_dnf.py'
Feb 26 20:22:34 compute-0 sudo[49481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:34 compute-0 python3.9[49484]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:37 compute-0 sudo[49481]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:37 compute-0 sudo[49652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofhpcwhuurgkcigislgxpymkcbuyggu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137357.2912316-230-99181562405835/AnsiballZ_dnf.py'
Feb 26 20:22:37 compute-0 sudo[49652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:37 compute-0 python3.9[49655]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:38 compute-0 sudo[49652]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:39 compute-0 sudo[49806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mndownddxtgwuljujezhumgriaiaoysa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137359.1779706-239-41402970356048/AnsiballZ_dnf.py'
Feb 26 20:22:39 compute-0 sudo[49806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:39 compute-0 python3.9[49809]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:50 compute-0 sudo[49806]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:51 compute-0 sudo[50142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvtprmywefmsnwufkeqbamjgfvkkbxvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137370.8744822-248-4927177433073/AnsiballZ_dnf.py'
Feb 26 20:22:51 compute-0 sudo[50142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:51 compute-0 python3.9[50145]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:52 compute-0 sudo[50142]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:53 compute-0 sudo[50299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xotsmwyjpxhhlugbfawgctdymvnxxtiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137372.862721-258-53508816310188/AnsiballZ_dnf.py'
Feb 26 20:22:53 compute-0 sudo[50299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:53 compute-0 python3.9[50302]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:22:55 compute-0 sudo[50299]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:55 compute-0 sudo[50457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvekzdnjfmlceuybcgnhxlceqdltyemh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137375.3110185-269-123893932560007/AnsiballZ_file.py'
Feb 26 20:22:55 compute-0 sudo[50457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:55 compute-0 python3.9[50460]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:22:55 compute-0 sudo[50457]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:56 compute-0 sudo[50633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnuwxlzjqyufpxaizbixdpcvddcrcnjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137375.9055946-277-38879897493352/AnsiballZ_stat.py'
Feb 26 20:22:56 compute-0 sudo[50633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:56 compute-0 python3.9[50636]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:22:56 compute-0 sudo[50633]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:56 compute-0 sudo[50757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrocsgwjvzxtukqilndprvnrighebnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137375.9055946-277-38879897493352/AnsiballZ_copy.py'
Feb 26 20:22:56 compute-0 sudo[50757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:56 compute-0 python3.9[50760]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1772137375.9055946-277-38879897493352/.source.json _original_basename=.z2go0tbt follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:22:56 compute-0 sudo[50757]: pam_unix(sudo:session): session closed for user root
Feb 26 20:22:57 compute-0 sudo[50910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzoebopenkhhbvtgpesygugsqzmybxsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137377.210809-295-71280469517706/AnsiballZ_podman_image.py'
Feb 26 20:22:57 compute-0 sudo[50910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:22:57 compute-0 python3.9[50913]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 26 20:22:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:22:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2394112926-lower\x2dmapped.mount: Deactivated successfully.
Feb 26 20:23:02 compute-0 podman[50925]: 2026-02-26 20:23:02.719502188 +0000 UTC m=+4.709612638 image pull ce6781f051bf092c13d84cb587c56ad7edaa58b70fcc0effc1dff15724d5232e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 26 20:23:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:02 compute-0 sudo[50910]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:03 compute-0 sudo[51224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqpavktssfjzdyfpvprouiwckmvzsyvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137383.2020586-306-80719210736166/AnsiballZ_podman_image.py'
Feb 26 20:23:03 compute-0 sudo[51224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:03 compute-0 python3.9[51227]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 26 20:23:12 compute-0 podman[51239]: 2026-02-26 20:23:12.162492451 +0000 UTC m=+8.496591765 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:23:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:12 compute-0 sudo[51224]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:12 compute-0 sudo[51533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaiwjrsbzufkwfzsicqozkcxjjghstte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137392.599125-316-176785253388399/AnsiballZ_podman_image.py'
Feb 26 20:23:12 compute-0 sudo[51533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:13 compute-0 python3.9[51536]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 26 20:23:22 compute-0 podman[51547]: 2026-02-26 20:23:22.775007558 +0000 UTC m=+9.678834538 image pull 7e637240710437807d86f704ec92f4417e40d6b1f76088848cab504c91655fe5 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 26 20:23:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:23 compute-0 sudo[51533]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:23 compute-0 sudo[51812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhheeojxecdrwxqewncvsujlrrpwfvbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137403.3195057-327-230028852145033/AnsiballZ_podman_image.py'
Feb 26 20:23:23 compute-0 sudo[51812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:23 compute-0 python3.9[51815]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 26 20:23:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:40 compute-0 podman[51827]: 2026-02-26 20:23:40.360236403 +0000 UTC m=+16.546854446 image pull 85a67c09da63837d01bdd446430e96c969ea53b46c93eebb5caba564f6cc2835 quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested
Feb 26 20:23:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:40 compute-0 sudo[51812]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:41 compute-0 sudo[52148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaxsvbndavwqztmebxvkblrarmzaiatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137420.7359173-327-205597984078642/AnsiballZ_podman_image.py'
Feb 26 20:23:41 compute-0 sudo[52148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:41 compute-0 python3.9[52151]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 26 20:23:42 compute-0 podman[52163]: 2026-02-26 20:23:42.349583323 +0000 UTC m=+1.043000703 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 26 20:23:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:42 compute-0 sudo[52148]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:42 compute-0 sshd-session[45884]: Connection closed by 192.168.122.30 port 50592
Feb 26 20:23:42 compute-0 sshd-session[45881]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:23:42 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 26 20:23:42 compute-0 systemd[1]: session-10.scope: Consumed 2min 3.191s CPU time.
Feb 26 20:23:42 compute-0 systemd-logind[825]: Session 10 logged out. Waiting for processes to exit.
Feb 26 20:23:42 compute-0 systemd-logind[825]: Removed session 10.
Feb 26 20:23:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:23:48 compute-0 sshd-session[52311]: Accepted publickey for zuul from 192.168.122.30 port 34600 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:23:48 compute-0 systemd-logind[825]: New session 11 of user zuul.
Feb 26 20:23:48 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 26 20:23:48 compute-0 sshd-session[52311]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:23:49 compute-0 python3.9[52464]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:23:50 compute-0 sudo[52618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dijpccvysnuhnkhbblqsrqwykgwmrbwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137430.306483-32-179161066186496/AnsiballZ_getent.py'
Feb 26 20:23:50 compute-0 sudo[52618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:50 compute-0 python3.9[52621]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 26 20:23:50 compute-0 sudo[52618]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:51 compute-0 sudo[52772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oerzqfwxcpuleqmjvoxuobxqrbqnoeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137431.0524821-40-29790479526782/AnsiballZ_group.py'
Feb 26 20:23:51 compute-0 sudo[52772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:51 compute-0 python3.9[52775]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 26 20:23:51 compute-0 groupadd[52776]: group added to /etc/group: name=openvswitch, GID=42476
Feb 26 20:23:51 compute-0 groupadd[52776]: group added to /etc/gshadow: name=openvswitch
Feb 26 20:23:51 compute-0 groupadd[52776]: new group: name=openvswitch, GID=42476
Feb 26 20:23:51 compute-0 sudo[52772]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:52 compute-0 sudo[52931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ombvozmvrvdcchndjayuiyzhadcdwzit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137431.946564-48-55665217589477/AnsiballZ_user.py'
Feb 26 20:23:52 compute-0 sudo[52931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:52 compute-0 python3.9[52934]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 26 20:23:52 compute-0 useradd[52936]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/1
Feb 26 20:23:52 compute-0 useradd[52936]: add 'openvswitch' to group 'hugetlbfs'
Feb 26 20:23:52 compute-0 useradd[52936]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 26 20:23:52 compute-0 sudo[52931]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:53 compute-0 sudo[53092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeomdiishbtzpnwzhbvwmgitmrtrjfti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137433.0198026-58-105877728650242/AnsiballZ_setup.py'
Feb 26 20:23:53 compute-0 sudo[53092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:53 compute-0 python3.9[53095]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:23:53 compute-0 sudo[53092]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:54 compute-0 sudo[53177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aojrdzwwqdtpbfwulwlkmshrczecuppp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137433.0198026-58-105877728650242/AnsiballZ_dnf.py'
Feb 26 20:23:54 compute-0 sudo[53177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:54 compute-0 python3.9[53180]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:23:56 compute-0 sudo[53177]: pam_unix(sudo:session): session closed for user root
Feb 26 20:23:56 compute-0 sudo[53340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yocrpelegfqeyebzluxzlmehoougbbvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137436.405021-72-186074620752219/AnsiballZ_dnf.py'
Feb 26 20:23:56 compute-0 sudo[53340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:23:56 compute-0 python3.9[53343]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:24:10 compute-0 kernel: SELinux:  Converting 2740 SID table entries...
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:24:10 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:24:10 compute-0 groupadd[53366]: group added to /etc/group: name=unbound, GID=994
Feb 26 20:24:10 compute-0 groupadd[53366]: group added to /etc/gshadow: name=unbound
Feb 26 20:24:10 compute-0 groupadd[53366]: new group: name=unbound, GID=994
Feb 26 20:24:10 compute-0 useradd[53373]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 26 20:24:10 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 26 20:24:10 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 26 20:24:11 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:24:11 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:24:12 compute-0 systemd[1]: Reloading.
Feb 26 20:24:12 compute-0 systemd-rc-local-generator[53868]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:24:12 compute-0 systemd-sysv-generator[53876]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:24:12 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:24:12 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:24:12 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:24:12 compute-0 systemd[1]: run-r8013597944324d7787b699812f078485.service: Deactivated successfully.
Feb 26 20:24:12 compute-0 sudo[53340]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:13 compute-0 sudo[54464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwkiornlxfarzelhwmxupriadcqdaost ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137452.908121-80-275993342073162/AnsiballZ_systemd.py'
Feb 26 20:24:13 compute-0 sudo[54464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:13 compute-0 python3.9[54467]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:24:13 compute-0 systemd[1]: Reloading.
Feb 26 20:24:13 compute-0 systemd-rc-local-generator[54500]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:24:13 compute-0 systemd-sysv-generator[54504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:24:14 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 26 20:24:14 compute-0 chown[54516]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 26 20:24:14 compute-0 ovs-ctl[54521]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 26 20:24:14 compute-0 ovs-ctl[54521]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 26 20:24:14 compute-0 ovs-ctl[54521]: Starting ovsdb-server [  OK  ]
Feb 26 20:24:14 compute-0 ovs-vsctl[54570]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 26 20:24:14 compute-0 ovs-vsctl[54590]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"62bfa765-f40e-4724-bf05-2e8b811f0867\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 26 20:24:14 compute-0 ovs-ctl[54521]: Configuring Open vSwitch system IDs [  OK  ]
Feb 26 20:24:14 compute-0 ovs-ctl[54521]: Enabling remote OVSDB managers [  OK  ]
Feb 26 20:24:14 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 26 20:24:14 compute-0 ovs-vsctl[54596]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 26 20:24:14 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 26 20:24:14 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 26 20:24:14 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 26 20:24:14 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 26 20:24:14 compute-0 ovs-ctl[54640]: Inserting openvswitch module [  OK  ]
Feb 26 20:24:14 compute-0 ovs-ctl[54609]: Starting ovs-vswitchd [  OK  ]
Feb 26 20:24:14 compute-0 ovs-vsctl[54657]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 26 20:24:14 compute-0 ovs-ctl[54609]: Enabling remote OVSDB managers [  OK  ]
Feb 26 20:24:14 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 26 20:24:14 compute-0 systemd[1]: Starting Open vSwitch...
Feb 26 20:24:14 compute-0 systemd[1]: Finished Open vSwitch.
Feb 26 20:24:14 compute-0 sudo[54464]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:15 compute-0 python3.9[54809]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:24:16 compute-0 sudo[54959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvmhwalnapnuflcdfgcqdygazxziyoeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137455.6069942-99-226067140627546/AnsiballZ_sefcontext.py'
Feb 26 20:24:16 compute-0 sudo[54959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:16 compute-0 python3.9[54962]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 26 20:24:17 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:24:17 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:24:17 compute-0 sudo[54959]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:18 compute-0 python3.9[55118]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:24:19 compute-0 sudo[55274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxmugmjdcbhbpzbuvbcmubgnjddhudow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137458.9274163-117-159958289488118/AnsiballZ_dnf.py'
Feb 26 20:24:19 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 26 20:24:19 compute-0 sudo[55274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:19 compute-0 python3.9[55277]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:24:20 compute-0 sudo[55274]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:21 compute-0 sudo[55428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maykfyvgyxvnkdqbnbwohsuglimgizrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137460.8273613-125-259727350076950/AnsiballZ_command.py'
Feb 26 20:24:21 compute-0 sudo[55428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:21 compute-0 python3.9[55431]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:24:22 compute-0 sudo[55428]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:22 compute-0 sudo[55716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fitkjmtqlhrzjbhbrjssfauugwnvpaaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137462.3514936-133-123694974353716/AnsiballZ_file.py'
Feb 26 20:24:22 compute-0 sudo[55716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:22 compute-0 python3.9[55719]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 26 20:24:22 compute-0 sudo[55716]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:23 compute-0 python3.9[55869]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:24:24 compute-0 sudo[56021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sphxtckkglmzlibpvzluoahphydmbpjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137463.8804939-149-51360501557973/AnsiballZ_dnf.py'
Feb 26 20:24:24 compute-0 sudo[56021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:24 compute-0 python3.9[56024]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:24:26 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:24:26 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:24:26 compute-0 systemd[1]: Reloading.
Feb 26 20:24:26 compute-0 systemd-sysv-generator[56067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:24:26 compute-0 systemd-rc-local-generator[56060]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:24:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:24:26 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:24:26 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:24:26 compute-0 systemd[1]: run-rc974790d53a242abba42b7e2f6f39c1e.service: Deactivated successfully.
Feb 26 20:24:26 compute-0 sudo[56021]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:27 compute-0 sudo[56347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwhrvnqnciwcvuebxmecnhnnphgadfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137466.8234632-157-176026047154306/AnsiballZ_systemd.py'
Feb 26 20:24:27 compute-0 sudo[56347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:27 compute-0 python3.9[56350]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:24:27 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 26 20:24:27 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 26 20:24:27 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 26 20:24:27 compute-0 systemd[1]: Stopping Network Manager...
Feb 26 20:24:27 compute-0 NetworkManager[7682]: <info>  [1772137467.5077] caught SIGTERM, shutting down normally.
Feb 26 20:24:27 compute-0 NetworkManager[7682]: <info>  [1772137467.5097] dhcp4 (eth0): canceled DHCP transaction
Feb 26 20:24:27 compute-0 NetworkManager[7682]: <info>  [1772137467.5097] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 26 20:24:27 compute-0 NetworkManager[7682]: <info>  [1772137467.5097] dhcp4 (eth0): state changed no lease
Feb 26 20:24:27 compute-0 NetworkManager[7682]: <info>  [1772137467.5101] manager: NetworkManager state is now CONNECTED_SITE
Feb 26 20:24:27 compute-0 NetworkManager[7682]: <info>  [1772137467.5169] exiting (success)
Feb 26 20:24:27 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 26 20:24:27 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 26 20:24:27 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 26 20:24:27 compute-0 systemd[1]: Stopped Network Manager.
Feb 26 20:24:27 compute-0 systemd[1]: NetworkManager.service: Consumed 12.072s CPU time, 4.2M memory peak, read 0B from disk, written 23.0K to disk.
Feb 26 20:24:27 compute-0 systemd[1]: Starting Network Manager...
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6070] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:f3703019-5d3d-46b0-a4dd-d38ee7f05396)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6071] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6153] manager[0x55cd3d44f000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 26 20:24:27 compute-0 systemd[1]: Starting Hostname Service...
Feb 26 20:24:27 compute-0 systemd[1]: Started Hostname Service.
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6789] hostname: hostname: using hostnamed
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6790] hostname: static hostname changed from (none) to "compute-0"
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6800] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6808] manager[0x55cd3d44f000]: rfkill: Wi-Fi hardware radio set enabled
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6808] manager[0x55cd3d44f000]: rfkill: WWAN hardware radio set enabled
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6846] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6861] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6863] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6864] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6865] manager: Networking is enabled by state file
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6868] settings: Loaded settings plugin: keyfile (internal)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6874] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6917] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6935] dhcp: init: Using DHCP client 'internal'
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6940] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6950] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6959] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6973] device (lo): Activation: starting connection 'lo' (dd0b54dd-8e74-4c4e-991a-59513cc199d2)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6982] device (eth0): carrier: link connected
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6988] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6996] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.6997] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7007] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7018] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7027] device (eth1): carrier: link connected
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7032] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7041] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (b726c5e7-e898-5b0a-8d8d-cdda95de2c7d) (indicated)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7041] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7049] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7061] device (eth1): Activation: starting connection 'ci-private-network' (b726c5e7-e898-5b0a-8d8d-cdda95de2c7d)
Feb 26 20:24:27 compute-0 systemd[1]: Started Network Manager.
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7071] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7082] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7085] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7089] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7100] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7104] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7106] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7108] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7112] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7120] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7123] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7145] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7168] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7190] dhcp4 (eth0): state changed new lease, address=38.102.83.12
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7195] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7203] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 26 20:24:27 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7280] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7286] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7294] device (lo): Activation: successful, device activated.
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7307] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7317] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7323] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7327] device (eth1): Activation: successful, device activated.
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7342] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7345] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7350] manager: NetworkManager state is now CONNECTED_SITE
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7354] device (eth0): Activation: successful, device activated.
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7362] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 26 20:24:27 compute-0 NetworkManager[56360]: <info>  [1772137467.7368] manager: startup complete
Feb 26 20:24:27 compute-0 sudo[56347]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:27 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 26 20:24:28 compute-0 sudo[56575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpzjfgmszoujpjmoushvlrdaeibidafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137467.9274454-165-192443816927320/AnsiballZ_dnf.py'
Feb 26 20:24:28 compute-0 sudo[56575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:28 compute-0 python3.9[56578]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:24:32 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:24:32 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:24:32 compute-0 systemd[1]: Reloading.
Feb 26 20:24:32 compute-0 systemd-rc-local-generator[56629]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:24:32 compute-0 systemd-sysv-generator[56635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:24:32 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:24:34 compute-0 sudo[56575]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:24:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:24:34 compute-0 systemd[1]: run-rb46193b2ba794a8c8c6f45a69c46f2ee.service: Deactivated successfully.
Feb 26 20:24:34 compute-0 sudo[57058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqmszakpuzgeohzzrwokpefzidesjpvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137474.4124112-177-113463109169178/AnsiballZ_stat.py'
Feb 26 20:24:34 compute-0 sudo[57058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:34 compute-0 python3.9[57061]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:24:34 compute-0 sudo[57058]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:35 compute-0 sudo[57211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evpolybycyjnrjqhgswtazzfytrthzjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137475.056297-186-279901719145255/AnsiballZ_ini_file.py'
Feb 26 20:24:35 compute-0 sudo[57211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:35 compute-0 python3.9[57214]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:35 compute-0 sudo[57211]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:36 compute-0 sudo[57366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfcemfsqnlfpvttibfczgqwyvsaekrhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137476.0101578-196-154416013421894/AnsiballZ_ini_file.py'
Feb 26 20:24:36 compute-0 sudo[57366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:36 compute-0 python3.9[57369]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:36 compute-0 sudo[57366]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:36 compute-0 sudo[57519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpvrdugwcvbkjhsmwlmnulsqgquwbxue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137476.6706786-196-724150081749/AnsiballZ_ini_file.py'
Feb 26 20:24:36 compute-0 sudo[57519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:37 compute-0 python3.9[57522]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:37 compute-0 sudo[57519]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:37 compute-0 sudo[57672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntabnbngxcvngkohcnseqvxnrgghotne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137477.2614884-211-27131263483995/AnsiballZ_ini_file.py'
Feb 26 20:24:37 compute-0 sudo[57672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:37 compute-0 python3.9[57675]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:37 compute-0 sudo[57672]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:37 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 26 20:24:38 compute-0 sudo[57825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrhqjchfielynqpvbgefxxybuennkzsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137477.8807251-211-138648354836212/AnsiballZ_ini_file.py'
Feb 26 20:24:38 compute-0 sudo[57825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:38 compute-0 python3.9[57828]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:38 compute-0 sudo[57825]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:38 compute-0 sudo[57978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajvizmhvapgfmcfzxaoikeemmfgezuap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137478.4847915-226-71912146026737/AnsiballZ_stat.py'
Feb 26 20:24:38 compute-0 sudo[57978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:38 compute-0 python3.9[57981]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:24:38 compute-0 sudo[57978]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:39 compute-0 sudo[58102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpqpvgsljxfvktcrbxmmrzagpzemzkrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137478.4847915-226-71912146026737/AnsiballZ_copy.py'
Feb 26 20:24:39 compute-0 sudo[58102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:39 compute-0 python3.9[58105]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137478.4847915-226-71912146026737/.source _original_basename=.lqrjd_39 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:39 compute-0 sudo[58102]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:39 compute-0 sudo[58255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhaayylqbgqgardhfrmmffyysfqlddwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137479.700113-241-262830009988666/AnsiballZ_file.py'
Feb 26 20:24:39 compute-0 sudo[58255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:40 compute-0 python3.9[58258]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:40 compute-0 sudo[58255]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:40 compute-0 sudo[58408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiawbwnpnrdzscmzfvytnarpjxtglutp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137480.2860487-249-163914118270916/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 26 20:24:40 compute-0 sudo[58408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:40 compute-0 python3.9[58411]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 26 20:24:40 compute-0 sudo[58408]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:41 compute-0 sudo[58561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcefjnjtbzgzrbwyitdsdzbriswumjfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137481.016729-258-240528296452682/AnsiballZ_file.py'
Feb 26 20:24:41 compute-0 sudo[58561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:41 compute-0 python3.9[58564]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:41 compute-0 sudo[58561]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:42 compute-0 sudo[58714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eviaoczodfmjhojgyrqxvtbopfhjuzws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137481.763037-268-129218327182597/AnsiballZ_stat.py'
Feb 26 20:24:42 compute-0 sudo[58714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:42 compute-0 sudo[58714]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:42 compute-0 sudo[58838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sawgzzlacpmykyxwgwyhnltrdrlwlnap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137481.763037-268-129218327182597/AnsiballZ_copy.py'
Feb 26 20:24:42 compute-0 sudo[58838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:42 compute-0 sudo[58838]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:43 compute-0 sudo[58991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syyleuzdpjucjhtmxcwwaghvgdzijnvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137482.8770745-283-16557109882404/AnsiballZ_slurp.py'
Feb 26 20:24:43 compute-0 sudo[58991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:43 compute-0 python3.9[58994]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 26 20:24:43 compute-0 sudo[58991]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:44 compute-0 sudo[59167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwnljtbmnbdjbjltxxtkxdtzyybnzoxa ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137483.617237-292-204195296888037/async_wrapper.py j551968951150 300 /home/zuul/.ansible/tmp/ansible-tmp-1772137483.617237-292-204195296888037/AnsiballZ_edpm_os_net_config.py _'
Feb 26 20:24:44 compute-0 sudo[59167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:44 compute-0 ansible-async_wrapper.py[59170]: Invoked with j551968951150 300 /home/zuul/.ansible/tmp/ansible-tmp-1772137483.617237-292-204195296888037/AnsiballZ_edpm_os_net_config.py _
Feb 26 20:24:44 compute-0 ansible-async_wrapper.py[59173]: Starting module and watcher
Feb 26 20:24:44 compute-0 ansible-async_wrapper.py[59173]: Start watching 59174 (300)
Feb 26 20:24:44 compute-0 ansible-async_wrapper.py[59174]: Start module (59174)
Feb 26 20:24:44 compute-0 ansible-async_wrapper.py[59170]: Return async_wrapper task started.
Feb 26 20:24:44 compute-0 sudo[59167]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:44 compute-0 python3.9[59175]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 26 20:24:45 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 26 20:24:45 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 26 20:24:45 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 26 20:24:45 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 26 20:24:45 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5347] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5363] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5895] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5898] audit: op="connection-add" uuid="282015b3-88b6-428d-a1a5-aa767a21b19e" name="br-ex-br" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5913] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5915] audit: op="connection-add" uuid="6b799915-fb36-410f-a0f9-fdb54860b3f3" name="br-ex-port" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5927] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5928] audit: op="connection-add" uuid="3e9abaf9-1045-4c87-aa16-3483ce6ca6fe" name="eth1-port" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5940] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5942] audit: op="connection-add" uuid="4cba6ccb-2ae2-4e58-935a-be50eed02f45" name="vlan20-port" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5953] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5955] audit: op="connection-add" uuid="284fd80c-c567-4541-835d-d9d325f212b0" name="vlan21-port" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5964] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5966] audit: op="connection-add" uuid="832ba9ce-60a8-4138-af1b-da3242f15cd3" name="vlan22-port" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.5985] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority,connection.timestamp" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6000] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6001] audit: op="connection-add" uuid="e499b843-0b74-4c9b-b668-a752c02bbe2c" name="br-ex-if" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6062] audit: op="connection-update" uuid="b726c5e7-e898-5b0a-8d8d-cdda95de2c7d" name="ci-private-network" args="ipv6.routes,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.method,ovs-external-ids.data,ovs-interface.type,ipv4.never-default,ipv4.dns,ipv4.addresses,ipv4.routes,ipv4.method,ipv4.routing-rules,connection.controller,connection.master,connection.timestamp,connection.port-type,connection.slave-type" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6078] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6080] audit: op="connection-add" uuid="2ad56b95-091b-4aa7-b5c0-af9ee29d79cf" name="vlan20-if" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6094] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6096] audit: op="connection-add" uuid="1bba2d80-4fa7-4d45-9f16-747e1150171d" name="vlan21-if" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6111] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6112] audit: op="connection-add" uuid="b044e8e3-f14c-48f1-87fe-c5e799d11731" name="vlan22-if" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6123] audit: op="connection-delete" uuid="6899c43b-9685-3591-bd57-0b0b5009002c" name="Wired connection 1" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6140] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6143] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6149] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6153] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (282015b3-88b6-428d-a1a5-aa767a21b19e)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6153] audit: op="connection-activate" uuid="282015b3-88b6-428d-a1a5-aa767a21b19e" name="br-ex-br" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6155] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6156] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6161] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6166] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (6b799915-fb36-410f-a0f9-fdb54860b3f3)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6168] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6169] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6174] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6179] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (3e9abaf9-1045-4c87-aa16-3483ce6ca6fe)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6181] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6181] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6187] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6191] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (4cba6ccb-2ae2-4e58-935a-be50eed02f45)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6193] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6194] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6199] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6203] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (284fd80c-c567-4541-835d-d9d325f212b0)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6205] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6206] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6211] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6216] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (832ba9ce-60a8-4138-af1b-da3242f15cd3)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6217] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6220] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6222] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6230] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6231] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6234] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6239] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e499b843-0b74-4c9b-b668-a752c02bbe2c)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6240] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6244] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6246] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6247] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6249] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6263] device (eth1): disconnecting for new activation request.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6264] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6268] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6270] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6272] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6275] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6276] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6280] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6285] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (2ad56b95-091b-4aa7-b5c0-af9ee29d79cf)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6286] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6290] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6292] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6294] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6298] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6300] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6304] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6310] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (1bba2d80-4fa7-4d45-9f16-747e1150171d)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6311] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6314] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6317] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6319] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6323] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <warn>  [1772137486.6324] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6328] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6334] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (b044e8e3-f14c-48f1-87fe-c5e799d11731)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6335] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6339] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6341] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6343] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6345] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6360] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,connection.autoconnect-priority" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6363] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6367] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6369] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6376] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6382] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6386] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6389] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6391] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6395] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6399] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6403] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6404] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6409] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6413] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6417] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6420] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6424] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 kernel: Timeout policy base is empty
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6430] dhcp4 (eth0): canceled DHCP transaction
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6430] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6430] dhcp4 (eth0): state changed no lease
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6431] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 26 20:24:46 compute-0 systemd-udevd[59182]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6441] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6444] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59176 uid=0 result="fail" reason="Device is not activated"
Feb 26 20:24:46 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6481] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6486] dhcp4 (eth0): state changed new lease, address=38.102.83.12
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6532] device (eth1): disconnecting for new activation request.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6533] audit: op="connection-activate" uuid="b726c5e7-e898-5b0a-8d8d-cdda95de2c7d" name="ci-private-network" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6581] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 26 20:24:46 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6588] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6687] device (eth1): Activation: starting connection 'ci-private-network' (b726c5e7-e898-5b0a-8d8d-cdda95de2c7d)
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6696] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 26 20:24:46 compute-0 kernel: br-ex: entered promiscuous mode
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6711] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6714] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6720] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59176 uid=0 result="success"
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6721] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6722] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6723] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6725] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6726] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6727] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6733] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6740] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6744] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6747] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6751] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6755] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6758] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6761] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6764] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6767] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 kernel: vlan22: entered promiscuous mode
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6771] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 systemd-udevd[59181]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6774] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6782] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6787] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6791] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 kernel: vlan21: entered promiscuous mode
Feb 26 20:24:46 compute-0 systemd-udevd[59180]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6843] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6846] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6850] device (eth1): Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6856] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6864] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 kernel: vlan20: entered promiscuous mode
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6893] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6897] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6901] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 systemd-udevd[59276]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:24:46 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6946] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6955] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6980] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.6989] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7001] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7002] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7005] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7017] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7018] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7021] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7024] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7038] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7080] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7083] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 26 20:24:46 compute-0 NetworkManager[56360]: <info>  [1772137486.7088] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 26 20:24:47 compute-0 NetworkManager[56360]: <info>  [1772137487.8385] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59176 uid=0 result="success"
Feb 26 20:24:47 compute-0 NetworkManager[56360]: <info>  [1772137487.9927] checkpoint[0x55cd3d424950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 26 20:24:47 compute-0 NetworkManager[56360]: <info>  [1772137487.9930] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 sudo[59510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqjickfuctiqiyebuqnzieuovtsgywyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137487.7662923-292-222428320759395/AnsiballZ_async_status.py'
Feb 26 20:24:48 compute-0 sudo[59510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.2433] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.2445] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 python3.9[59513]: ansible-ansible.legacy.async_status Invoked with jid=j551968951150.59170 mode=status _async_dir=/root/.ansible_async
Feb 26 20:24:48 compute-0 sudo[59510]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.4528] audit: op="networking-control" arg="global-dns-configuration" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.4556] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.4586] audit: op="networking-control" arg="global-dns-configuration" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.4612] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.5953] checkpoint[0x55cd3d424a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 26 20:24:48 compute-0 NetworkManager[56360]: <info>  [1772137488.5964] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59176 uid=0 result="success"
Feb 26 20:24:48 compute-0 ansible-async_wrapper.py[59174]: Module complete (59174)
Feb 26 20:24:49 compute-0 irqbalance[809]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 26 20:24:49 compute-0 irqbalance[809]: IRQ 26 affinity is now unmanaged
Feb 26 20:24:49 compute-0 ansible-async_wrapper.py[59173]: Done in kid B.
Feb 26 20:24:51 compute-0 sudo[59615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpphqqxphiithctotyhjvrlzgcvfxjlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137487.7662923-292-222428320759395/AnsiballZ_async_status.py'
Feb 26 20:24:51 compute-0 sudo[59615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:51 compute-0 python3.9[59618]: ansible-ansible.legacy.async_status Invoked with jid=j551968951150.59170 mode=status _async_dir=/root/.ansible_async
Feb 26 20:24:51 compute-0 sudo[59615]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:52 compute-0 sudo[59716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pngjsuuthrkgpteikmijeysdfhkoyrot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137487.7662923-292-222428320759395/AnsiballZ_async_status.py'
Feb 26 20:24:52 compute-0 sudo[59716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:52 compute-0 python3.9[59719]: ansible-ansible.legacy.async_status Invoked with jid=j551968951150.59170 mode=cleanup _async_dir=/root/.ansible_async
Feb 26 20:24:52 compute-0 sudo[59716]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:52 compute-0 sudo[59869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkttdpzqhpzahshtlscgfdknuzbusxrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137492.6328173-319-20230396066949/AnsiballZ_stat.py'
Feb 26 20:24:52 compute-0 sudo[59869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:53 compute-0 python3.9[59872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:24:53 compute-0 sudo[59869]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:53 compute-0 sudo[59993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkpmmqptitpozoejftpskglenlotibnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137492.6328173-319-20230396066949/AnsiballZ_copy.py'
Feb 26 20:24:53 compute-0 sudo[59993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:53 compute-0 python3.9[59996]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137492.6328173-319-20230396066949/.source.returncode _original_basename=.d62hl5ze follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:53 compute-0 sudo[59993]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:54 compute-0 sudo[60146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wedlpylyuefvfrxmffqsaulpsuwasgjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137493.7727098-335-251489419212422/AnsiballZ_stat.py'
Feb 26 20:24:54 compute-0 sudo[60146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:54 compute-0 python3.9[60149]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:24:54 compute-0 sudo[60146]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:54 compute-0 sudo[60270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljclkmypspqatmlrdnmuyetgrukhtihz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137493.7727098-335-251489419212422/AnsiballZ_copy.py'
Feb 26 20:24:54 compute-0 sudo[60270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:54 compute-0 python3.9[60273]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137493.7727098-335-251489419212422/.source.cfg _original_basename=.60x6lkag follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:24:54 compute-0 sudo[60270]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:55 compute-0 sudo[60424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opwbbvpsailkecpuzjkjmqjvmdxxzxar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137494.8382344-350-46439916884650/AnsiballZ_systemd.py'
Feb 26 20:24:55 compute-0 sudo[60424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:24:55 compute-0 python3.9[60427]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:24:55 compute-0 systemd[1]: Reloading Network Manager...
Feb 26 20:24:55 compute-0 NetworkManager[56360]: <info>  [1772137495.4048] audit: op="reload" arg="0" pid=60431 uid=0 result="success"
Feb 26 20:24:55 compute-0 NetworkManager[56360]: <info>  [1772137495.4056] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 26 20:24:55 compute-0 systemd[1]: Reloaded Network Manager.
Feb 26 20:24:55 compute-0 sudo[60424]: pam_unix(sudo:session): session closed for user root
Feb 26 20:24:55 compute-0 sshd-session[52314]: Connection closed by 192.168.122.30 port 34600
Feb 26 20:24:55 compute-0 sshd-session[52311]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:24:55 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 26 20:24:55 compute-0 systemd-logind[825]: Session 11 logged out. Waiting for processes to exit.
Feb 26 20:24:55 compute-0 systemd[1]: session-11.scope: Consumed 47.852s CPU time.
Feb 26 20:24:55 compute-0 systemd-logind[825]: Removed session 11.
Feb 26 20:24:57 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 26 20:25:01 compute-0 sshd-session[60463]: Accepted publickey for zuul from 192.168.122.30 port 45196 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:25:01 compute-0 systemd-logind[825]: New session 12 of user zuul.
Feb 26 20:25:01 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 26 20:25:01 compute-0 sshd-session[60463]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:25:02 compute-0 python3.9[60617]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:25:03 compute-0 python3.9[60771]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:25:04 compute-0 python3.9[60960]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:25:05 compute-0 sshd-session[60466]: Connection closed by 192.168.122.30 port 45196
Feb 26 20:25:05 compute-0 sshd-session[60463]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:25:05 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 26 20:25:05 compute-0 systemd[1]: session-12.scope: Consumed 2.182s CPU time.
Feb 26 20:25:05 compute-0 systemd-logind[825]: Session 12 logged out. Waiting for processes to exit.
Feb 26 20:25:05 compute-0 systemd-logind[825]: Removed session 12.
Feb 26 20:25:05 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 26 20:25:10 compute-0 sshd-session[60989]: Accepted publickey for zuul from 192.168.122.30 port 53548 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:25:10 compute-0 systemd-logind[825]: New session 13 of user zuul.
Feb 26 20:25:10 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 26 20:25:10 compute-0 sshd-session[60989]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:25:11 compute-0 python3.9[61143]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:25:12 compute-0 python3.9[61297]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:25:12 compute-0 sudo[61451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msaqspruwsefstgzcuudlbhupbwuqdjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137512.7227027-35-54880639367704/AnsiballZ_setup.py'
Feb 26 20:25:12 compute-0 sudo[61451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:13 compute-0 python3.9[61454]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:25:13 compute-0 sudo[61451]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:13 compute-0 sudo[61536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omuvkemzaxtvbcysmtrttuvqzbmjrdgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137512.7227027-35-54880639367704/AnsiballZ_dnf.py'
Feb 26 20:25:13 compute-0 sudo[61536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:14 compute-0 python3.9[61539]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:25:15 compute-0 sudo[61536]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:15 compute-0 sudo[61691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clilahtmvnevexrzdebdoovrmuwzfois ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137515.565858-47-2345870569056/AnsiballZ_setup.py'
Feb 26 20:25:15 compute-0 sudo[61691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:16 compute-0 python3.9[61694]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:25:16 compute-0 sudo[61691]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:16 compute-0 sudo[61883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grysccgfkhvrtbuxzoeuzftqoojjxkgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137516.5448942-58-186312521169181/AnsiballZ_file.py'
Feb 26 20:25:16 compute-0 sudo[61883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:17 compute-0 python3.9[61886]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:17 compute-0 sudo[61883]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:17 compute-0 sudo[62037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgtxzmauyvbxixuwnpsunzuxnpeqeplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137517.261818-66-234967462849050/AnsiballZ_command.py'
Feb 26 20:25:17 compute-0 sudo[62037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:17 compute-0 python3.9[62040]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:25:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:25:17 compute-0 sudo[62037]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:18 compute-0 sudo[62201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdappdvjhwkciolvrmfcclgeggvyoydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137518.1000009-74-9964985322596/AnsiballZ_stat.py'
Feb 26 20:25:18 compute-0 sudo[62201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:18 compute-0 python3.9[62204]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:18 compute-0 sudo[62201]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:19 compute-0 sudo[62280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unebneholrjoledmwakgnjbvvbonfrxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137518.1000009-74-9964985322596/AnsiballZ_file.py'
Feb 26 20:25:19 compute-0 sudo[62280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:19 compute-0 python3.9[62283]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:19 compute-0 sudo[62280]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:19 compute-0 sudo[62433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afeqpyegqrxxdtakqvuapzdylaovplwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137519.5527544-86-238542838416537/AnsiballZ_stat.py'
Feb 26 20:25:19 compute-0 sudo[62433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:19 compute-0 python3.9[62436]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:20 compute-0 sudo[62433]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:20 compute-0 sudo[62512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjrjtbrwdamjigvflgnlhxyocetptxho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137519.5527544-86-238542838416537/AnsiballZ_file.py'
Feb 26 20:25:20 compute-0 sudo[62512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:20 compute-0 python3.9[62515]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:20 compute-0 sudo[62512]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:20 compute-0 sudo[62665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-barjydxgmkqotplhwbnsvxpzpdcczykj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137520.5756657-99-133141899604290/AnsiballZ_ini_file.py'
Feb 26 20:25:20 compute-0 sudo[62665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:21 compute-0 python3.9[62668]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:21 compute-0 sudo[62665]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:21 compute-0 sudo[62818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxlmydgtxrorvlxgidijxdpdjxpbjuhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137521.2657416-99-175875733788189/AnsiballZ_ini_file.py'
Feb 26 20:25:21 compute-0 sudo[62818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:21 compute-0 python3.9[62821]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:21 compute-0 sudo[62818]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:22 compute-0 sudo[62971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcjsicpducpfspcwefmisdvxhvenxmbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137521.8371553-99-120341564723813/AnsiballZ_ini_file.py'
Feb 26 20:25:22 compute-0 sudo[62971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:22 compute-0 python3.9[62974]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:22 compute-0 sudo[62971]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:22 compute-0 sudo[63124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfdwseycdzugyorbijyrsuvkbwwiatrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137522.4499655-99-22565907956988/AnsiballZ_ini_file.py'
Feb 26 20:25:22 compute-0 sudo[63124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:22 compute-0 python3.9[63127]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:22 compute-0 sudo[63124]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:23 compute-0 sudo[63277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbxseifqnkdzvmwakshcjkhcwrtmkkoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137523.1171122-130-269904167069865/AnsiballZ_dnf.py'
Feb 26 20:25:23 compute-0 sudo[63277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:23 compute-0 python3.9[63280]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:25:24 compute-0 sudo[63277]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:25 compute-0 sudo[63431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcrwjumvejrecglvgqaydqqhmckabukw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137525.3672476-141-168920482259054/AnsiballZ_setup.py'
Feb 26 20:25:25 compute-0 sudo[63431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:25 compute-0 python3.9[63434]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:25:25 compute-0 sudo[63431]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:26 compute-0 sudo[63586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkmcoghjqwrodtwujgupdexgmhqjmazl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137526.0974886-149-152703273177897/AnsiballZ_stat.py'
Feb 26 20:25:26 compute-0 sudo[63586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:26 compute-0 python3.9[63589]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:25:26 compute-0 sudo[63586]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:27 compute-0 sudo[63739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhuwknewelwihfdidcehpfrnfmzaszl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137526.800032-158-262049521156524/AnsiballZ_stat.py'
Feb 26 20:25:27 compute-0 sudo[63739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:27 compute-0 python3.9[63742]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:25:27 compute-0 sudo[63739]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:27 compute-0 sudo[63892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpvvkwttsuhqjbtrtojosmvlxesqdkrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137527.679297-168-129030187662465/AnsiballZ_command.py'
Feb 26 20:25:27 compute-0 sudo[63892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:28 compute-0 python3.9[63895]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:25:28 compute-0 sudo[63892]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:28 compute-0 sudo[64046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpxtxrualmsycrckzlwltkxfbqstiihm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137528.4694502-178-124371424471037/AnsiballZ_service_facts.py'
Feb 26 20:25:28 compute-0 sudo[64046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:29 compute-0 python3.9[64049]: ansible-service_facts Invoked
Feb 26 20:25:29 compute-0 network[64066]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:25:29 compute-0 network[64067]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:25:29 compute-0 network[64068]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:25:33 compute-0 sudo[64046]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:34 compute-0 sudo[64352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqdatjjxsowqiizbdxfnhvlprzwiiowt ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1772137534.4812307-193-21505951295122/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1772137534.4812307-193-21505951295122/args'
Feb 26 20:25:34 compute-0 sudo[64352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:34 compute-0 sudo[64352]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:35 compute-0 sudo[64520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cszqqmpbbaoswrddixciehnfjeqfcfgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137535.077592-204-116380212324249/AnsiballZ_dnf.py'
Feb 26 20:25:35 compute-0 sudo[64520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:35 compute-0 python3.9[64523]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:25:36 compute-0 sudo[64520]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:37 compute-0 sudo[64674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqluznpfphnedyimulkgoxuojzpkxcog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137537.2005525-217-138966223785086/AnsiballZ_package_facts.py'
Feb 26 20:25:37 compute-0 sudo[64674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:38 compute-0 python3.9[64677]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 26 20:25:38 compute-0 sudo[64674]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:39 compute-0 sudo[64827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjvkkmqodupbhcvktnlikovvemeelztk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137538.7883265-227-122782298385693/AnsiballZ_stat.py'
Feb 26 20:25:39 compute-0 sudo[64827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:39 compute-0 python3.9[64830]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:39 compute-0 sudo[64827]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:39 compute-0 sudo[64953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dawgtohyozbkngwgeizhwoayefkosjfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137538.7883265-227-122782298385693/AnsiballZ_copy.py'
Feb 26 20:25:39 compute-0 sudo[64953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:40 compute-0 python3.9[64956]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137538.7883265-227-122782298385693/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:40 compute-0 sudo[64953]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:40 compute-0 sudo[65108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhptwratfrjmmbspojjaqexvfueycqua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137540.3424284-242-45583678519909/AnsiballZ_stat.py'
Feb 26 20:25:40 compute-0 sudo[65108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:40 compute-0 python3.9[65111]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:40 compute-0 sudo[65108]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:41 compute-0 sudo[65234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omvcuxlfggnqafghompwhrjwdabnxusq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137540.3424284-242-45583678519909/AnsiballZ_copy.py'
Feb 26 20:25:41 compute-0 sudo[65234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:41 compute-0 python3.9[65237]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137540.3424284-242-45583678519909/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:41 compute-0 sudo[65234]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:42 compute-0 sudo[65389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqdgnewurhkeurmnlxpaaubooccgsalb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137541.8156874-263-189424446275885/AnsiballZ_lineinfile.py'
Feb 26 20:25:42 compute-0 sudo[65389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:42 compute-0 python3.9[65392]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:42 compute-0 sudo[65389]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:43 compute-0 sudo[65544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dudxvycgwfcttcjssjnbpvthoexbojyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137542.912727-278-26027035406288/AnsiballZ_setup.py'
Feb 26 20:25:43 compute-0 sudo[65544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:43 compute-0 python3.9[65547]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:25:43 compute-0 sudo[65544]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:44 compute-0 sudo[65629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxirufbdxpegubzyfzulwobgeycjibbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137542.912727-278-26027035406288/AnsiballZ_systemd.py'
Feb 26 20:25:44 compute-0 sudo[65629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:44 compute-0 python3.9[65632]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:25:44 compute-0 sudo[65629]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:45 compute-0 sudo[65784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvswlbkbzikzzklveyvenzcauvzlvzjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137545.2380428-294-29901679234885/AnsiballZ_setup.py'
Feb 26 20:25:45 compute-0 sudo[65784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:45 compute-0 python3.9[65787]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:25:46 compute-0 sudo[65784]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:46 compute-0 sudo[65869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srjpsiptmelrklczuemqgplxylrurqcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137545.2380428-294-29901679234885/AnsiballZ_systemd.py'
Feb 26 20:25:46 compute-0 sudo[65869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:46 compute-0 python3.9[65872]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:25:46 compute-0 chronyd[815]: chronyd exiting
Feb 26 20:25:46 compute-0 systemd[1]: Stopping NTP client/server...
Feb 26 20:25:46 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 26 20:25:46 compute-0 systemd[1]: Stopped NTP client/server.
Feb 26 20:25:46 compute-0 systemd[1]: Starting NTP client/server...
Feb 26 20:25:46 compute-0 chronyd[65881]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 26 20:25:46 compute-0 chronyd[65881]: Frequency -26.433 +/- 0.466 ppm read from /var/lib/chrony/drift
Feb 26 20:25:46 compute-0 chronyd[65881]: Loaded seccomp filter (level 2)
Feb 26 20:25:46 compute-0 systemd[1]: Started NTP client/server.
Feb 26 20:25:46 compute-0 sudo[65869]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:47 compute-0 sshd-session[60992]: Connection closed by 192.168.122.30 port 53548
Feb 26 20:25:47 compute-0 sshd-session[60989]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:25:47 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 26 20:25:47 compute-0 systemd[1]: session-13.scope: Consumed 24.585s CPU time.
Feb 26 20:25:47 compute-0 systemd-logind[825]: Session 13 logged out. Waiting for processes to exit.
Feb 26 20:25:47 compute-0 systemd-logind[825]: Removed session 13.
Feb 26 20:25:52 compute-0 sshd-session[65907]: Accepted publickey for zuul from 192.168.122.30 port 48064 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:25:52 compute-0 systemd-logind[825]: New session 14 of user zuul.
Feb 26 20:25:52 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 26 20:25:53 compute-0 sshd-session[65907]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:25:54 compute-0 python3.9[66060]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:25:54 compute-0 sudo[66214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzuybaqsbgvmlnlwnsfsvpamffwxuitu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137554.3805356-28-110727766819143/AnsiballZ_file.py'
Feb 26 20:25:54 compute-0 sudo[66214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:55 compute-0 python3.9[66217]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:55 compute-0 sudo[66214]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:55 compute-0 sudo[66390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqvyapjnuqtkivhrbtlnbbhktjpcokrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137555.2749803-36-102061309369276/AnsiballZ_stat.py'
Feb 26 20:25:55 compute-0 sudo[66390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:55 compute-0 python3.9[66393]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:55 compute-0 sudo[66390]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:56 compute-0 sudo[66469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaodvpjppbutpcysszzelbnzsltfsiro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137555.2749803-36-102061309369276/AnsiballZ_file.py'
Feb 26 20:25:56 compute-0 sudo[66469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:56 compute-0 python3.9[66472]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.45jrclty recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:56 compute-0 sudo[66469]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:56 compute-0 sudo[66622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uedtmhgvcqhxowcglopuqnqmoqeldbhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137556.6115324-56-262365004961938/AnsiballZ_stat.py'
Feb 26 20:25:56 compute-0 sudo[66622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:56 compute-0 python3.9[66625]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:57 compute-0 sudo[66622]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:57 compute-0 sudo[66746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idutouegbiwtqpbpjjqadlsdfumkbroi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137556.6115324-56-262365004961938/AnsiballZ_copy.py'
Feb 26 20:25:57 compute-0 sudo[66746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:57 compute-0 python3.9[66749]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137556.6115324-56-262365004961938/.source _original_basename=.qm0uioxx follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:25:57 compute-0 sudo[66746]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:58 compute-0 sudo[66899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnfvxtehxjnamyeyoocgxwejcelcikcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137557.7645144-72-165671526647386/AnsiballZ_file.py'
Feb 26 20:25:58 compute-0 sudo[66899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:58 compute-0 python3.9[66902]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:58 compute-0 sudo[66899]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:58 compute-0 sudo[67052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlyzsfhgzyvbxknlovemgupzemmdnbtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137558.4386263-80-196196587728857/AnsiballZ_stat.py'
Feb 26 20:25:58 compute-0 sudo[67052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:58 compute-0 python3.9[67055]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:58 compute-0 sudo[67052]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:59 compute-0 sudo[67176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viidihjsszcxglyngfzklkchamwrnuvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137558.4386263-80-196196587728857/AnsiballZ_copy.py'
Feb 26 20:25:59 compute-0 sudo[67176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:59 compute-0 python3.9[67179]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137558.4386263-80-196196587728857/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:25:59 compute-0 sudo[67176]: pam_unix(sudo:session): session closed for user root
Feb 26 20:25:59 compute-0 sudo[67329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fglppcfotdqpdgoyhwucxpmajwmgoigq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137559.445158-80-128439846463406/AnsiballZ_stat.py'
Feb 26 20:25:59 compute-0 sudo[67329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:25:59 compute-0 python3.9[67332]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:25:59 compute-0 sudo[67329]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:00 compute-0 sudo[67453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypmjadbghnxncyxdscfzgrubcyvpzbtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137559.445158-80-128439846463406/AnsiballZ_copy.py'
Feb 26 20:26:00 compute-0 sudo[67453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:00 compute-0 python3.9[67456]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137559.445158-80-128439846463406/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:26:00 compute-0 sudo[67453]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:00 compute-0 sudo[67606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whetkzqajarhpvxcwlnngmhlcbqgqska ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137560.620671-109-88537708643898/AnsiballZ_file.py'
Feb 26 20:26:00 compute-0 sudo[67606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:01 compute-0 python3.9[67609]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:01 compute-0 sudo[67606]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:01 compute-0 sudo[67759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpfwwarvxikotirabvqqsevyyzpiiumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137561.1700342-117-222733715833837/AnsiballZ_stat.py'
Feb 26 20:26:01 compute-0 sudo[67759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:01 compute-0 python3.9[67762]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:01 compute-0 sudo[67759]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:01 compute-0 sudo[67883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzlwzgjpgqsdzrnmfszduqefmhlskfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137561.1700342-117-222733715833837/AnsiballZ_copy.py'
Feb 26 20:26:01 compute-0 sudo[67883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:02 compute-0 python3.9[67886]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137561.1700342-117-222733715833837/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:02 compute-0 sudo[67883]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:02 compute-0 sudo[68036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lertnjwrjozbskwmpzqaxdcrkhrvmndr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137562.3006063-132-229580925782974/AnsiballZ_stat.py'
Feb 26 20:26:02 compute-0 sudo[68036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:02 compute-0 python3.9[68039]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:02 compute-0 sudo[68036]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:03 compute-0 sudo[68160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jktyimkotygygtyfbxpdkyrrfoewguxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137562.3006063-132-229580925782974/AnsiballZ_copy.py'
Feb 26 20:26:03 compute-0 sudo[68160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:03 compute-0 python3.9[68163]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137562.3006063-132-229580925782974/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:03 compute-0 sudo[68160]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:03 compute-0 sudo[68313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjgrvubdrflftbveyghdpfcizrpkvpez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137563.422294-147-51712691231587/AnsiballZ_systemd.py'
Feb 26 20:26:03 compute-0 sudo[68313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:04 compute-0 python3.9[68316]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:26:04 compute-0 systemd[1]: Reloading.
Feb 26 20:26:04 compute-0 systemd-rc-local-generator[68342]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:26:04 compute-0 systemd-sysv-generator[68348]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:26:04 compute-0 systemd[1]: Reloading.
Feb 26 20:26:04 compute-0 systemd-rc-local-generator[68385]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:26:04 compute-0 systemd-sysv-generator[68390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:26:04 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 26 20:26:04 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 26 20:26:04 compute-0 sudo[68313]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:05 compute-0 sudo[68555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orngumirruefljoduuhfgpwcgjvwxiig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137564.9711273-155-233821682060259/AnsiballZ_stat.py'
Feb 26 20:26:05 compute-0 sudo[68555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:05 compute-0 python3.9[68558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:05 compute-0 sudo[68555]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:05 compute-0 sudo[68679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zohqnlhawiwjadkftggroeagtgxqckqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137564.9711273-155-233821682060259/AnsiballZ_copy.py'
Feb 26 20:26:05 compute-0 sudo[68679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:05 compute-0 python3.9[68682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137564.9711273-155-233821682060259/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:05 compute-0 sudo[68679]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:06 compute-0 sudo[68832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlapgnbnefvkmkcvdzwshehwhgiviyhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137566.1366644-170-8376403293024/AnsiballZ_stat.py'
Feb 26 20:26:06 compute-0 sudo[68832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:06 compute-0 python3.9[68835]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:06 compute-0 sudo[68832]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:06 compute-0 sudo[68956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udhcvlxhvkgrwxqcxbwqphssntimepja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137566.1366644-170-8376403293024/AnsiballZ_copy.py'
Feb 26 20:26:06 compute-0 sudo[68956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:07 compute-0 python3.9[68959]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137566.1366644-170-8376403293024/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:07 compute-0 sudo[68956]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:07 compute-0 sudo[69109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjcnzfwcvggyreqwuyzspevwolaqueid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137567.285777-185-56754954627488/AnsiballZ_systemd.py'
Feb 26 20:26:07 compute-0 sudo[69109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:07 compute-0 python3.9[69112]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:26:07 compute-0 systemd[1]: Reloading.
Feb 26 20:26:07 compute-0 systemd-rc-local-generator[69143]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:26:07 compute-0 systemd-sysv-generator[69147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:26:08 compute-0 systemd[1]: Reloading.
Feb 26 20:26:08 compute-0 systemd-rc-local-generator[69182]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:26:08 compute-0 systemd-sysv-generator[69188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:26:08 compute-0 systemd[1]: Starting Create netns directory...
Feb 26 20:26:08 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 26 20:26:08 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 26 20:26:08 compute-0 systemd[1]: Finished Create netns directory.
Feb 26 20:26:08 compute-0 sudo[69109]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:09 compute-0 python3.9[69353]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:26:09 compute-0 network[69370]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:26:09 compute-0 network[69371]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:26:09 compute-0 network[69372]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:26:12 compute-0 sudo[69633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofbbnfvwqbspyjktiapzlgorpmgayphd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137571.8408394-201-195728007846374/AnsiballZ_systemd.py'
Feb 26 20:26:12 compute-0 sudo[69633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:12 compute-0 python3.9[69636]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:26:12 compute-0 systemd[1]: Reloading.
Feb 26 20:26:12 compute-0 systemd-rc-local-generator[69667]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:26:12 compute-0 systemd-sysv-generator[69671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:26:12 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 26 20:26:13 compute-0 iptables.init[69683]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 26 20:26:13 compute-0 iptables.init[69683]: iptables: Flushing firewall rules: [  OK  ]
Feb 26 20:26:13 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 26 20:26:13 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 26 20:26:13 compute-0 sudo[69633]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:13 compute-0 sudo[69877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghuntbjdorsrmkuteyblpbppxzeqdpjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137573.2865615-201-91800731746816/AnsiballZ_systemd.py'
Feb 26 20:26:13 compute-0 sudo[69877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:13 compute-0 python3.9[69880]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:26:13 compute-0 sudo[69877]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:14 compute-0 sudo[70032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-furvglzisunpqcxzhehukmphgbnlyste ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137574.1559904-217-40953251843147/AnsiballZ_systemd.py'
Feb 26 20:26:14 compute-0 sudo[70032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:14 compute-0 python3.9[70035]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:26:14 compute-0 systemd[1]: Reloading.
Feb 26 20:26:14 compute-0 systemd-rc-local-generator[70067]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:26:14 compute-0 systemd-sysv-generator[70071]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:26:15 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 26 20:26:15 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 26 20:26:15 compute-0 sudo[70032]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:15 compute-0 sudo[70233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwxbecpkselpiddtrsnkzpaulbfwdaft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137575.2874053-225-193910042464775/AnsiballZ_command.py'
Feb 26 20:26:15 compute-0 sudo[70233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:15 compute-0 python3.9[70236]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:26:15 compute-0 sudo[70233]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:16 compute-0 sudo[70387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfqunnfrwnimzxljoyjxyuqodgglgkea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137576.2777302-239-126565469604408/AnsiballZ_stat.py'
Feb 26 20:26:16 compute-0 sudo[70387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:16 compute-0 python3.9[70390]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:16 compute-0 sudo[70387]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:17 compute-0 sudo[70513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hajpgfrfbpkrdfquqicfwcajmuzrbalo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137576.2777302-239-126565469604408/AnsiballZ_copy.py'
Feb 26 20:26:17 compute-0 sudo[70513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:17 compute-0 python3.9[70516]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137576.2777302-239-126565469604408/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:17 compute-0 sudo[70513]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:17 compute-0 sudo[70667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvvvurfebglahnatdbsaecccakdsmodo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137577.4625838-254-251510863586958/AnsiballZ_systemd.py'
Feb 26 20:26:17 compute-0 sudo[70667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:18 compute-0 python3.9[70670]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:26:18 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 26 20:26:18 compute-0 sshd[1017]: Received SIGHUP; restarting.
Feb 26 20:26:18 compute-0 sshd[1017]: Server listening on 0.0.0.0 port 22.
Feb 26 20:26:18 compute-0 sshd[1017]: Server listening on :: port 22.
Feb 26 20:26:18 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 26 20:26:18 compute-0 sudo[70667]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:18 compute-0 sudo[70824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzrmzrfbvmvzugerdlgptywzbxtzofzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137578.2507389-262-212056437815063/AnsiballZ_file.py'
Feb 26 20:26:18 compute-0 sudo[70824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:18 compute-0 python3.9[70827]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:18 compute-0 sudo[70824]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:19 compute-0 sudo[70977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfkyuoffvvhpwankvopvuogjgafpvpbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137578.840462-270-14806174825769/AnsiballZ_stat.py'
Feb 26 20:26:19 compute-0 sudo[70977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:19 compute-0 python3.9[70980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:19 compute-0 sudo[70977]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:19 compute-0 sudo[71101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbxbqohlwwygtebzfbwpsdoxxqsqnybc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137578.840462-270-14806174825769/AnsiballZ_copy.py'
Feb 26 20:26:19 compute-0 sudo[71101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:19 compute-0 python3.9[71104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137578.840462-270-14806174825769/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:19 compute-0 sudo[71101]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:20 compute-0 sudo[71254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmwwbcsaydeowpwoqcgpcticruynidrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137580.073734-288-36330254468353/AnsiballZ_timezone.py'
Feb 26 20:26:20 compute-0 sudo[71254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:20 compute-0 python3.9[71257]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 26 20:26:20 compute-0 systemd[1]: Starting Time & Date Service...
Feb 26 20:26:20 compute-0 systemd[1]: Started Time & Date Service.
Feb 26 20:26:20 compute-0 sudo[71254]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:21 compute-0 sudo[71411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsrnxbhrucdkdvffgfsjdanspbrgogtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137581.020832-297-99694847575469/AnsiballZ_file.py'
Feb 26 20:26:21 compute-0 sudo[71411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:21 compute-0 python3.9[71414]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:21 compute-0 sudo[71411]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:21 compute-0 sudo[71564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiaouzjjrowfuilhoomilouqzknagulb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137581.6699636-305-191723369105787/AnsiballZ_stat.py'
Feb 26 20:26:21 compute-0 sudo[71564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:22 compute-0 python3.9[71567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:22 compute-0 sudo[71564]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:22 compute-0 sudo[71688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfscphrbdmyqcqfqgctzzbewekwixuuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137581.6699636-305-191723369105787/AnsiballZ_copy.py'
Feb 26 20:26:22 compute-0 sudo[71688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:22 compute-0 python3.9[71691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137581.6699636-305-191723369105787/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:22 compute-0 sudo[71688]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:23 compute-0 sudo[71841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxcvvmbbexxvxrquxhtpqmvpognwuisi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137582.7959292-320-277316788346548/AnsiballZ_stat.py'
Feb 26 20:26:23 compute-0 sudo[71841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:23 compute-0 python3.9[71844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:23 compute-0 sudo[71841]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:23 compute-0 sudo[71965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gghxdqavfzcaoayxhepqmfferntsubeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137582.7959292-320-277316788346548/AnsiballZ_copy.py'
Feb 26 20:26:23 compute-0 sudo[71965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:23 compute-0 python3.9[71968]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137582.7959292-320-277316788346548/.source.yaml _original_basename=._vpsf8yx follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:23 compute-0 sudo[71965]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:24 compute-0 sudo[72118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfvlqnkawgygeipvanqdmftcjyyfqslx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137583.8256745-335-110314003699539/AnsiballZ_stat.py'
Feb 26 20:26:24 compute-0 sudo[72118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:24 compute-0 python3.9[72121]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:24 compute-0 sudo[72118]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:24 compute-0 sudo[72242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nszgjdekizkfmykbhmznckypbouliyvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137583.8256745-335-110314003699539/AnsiballZ_copy.py'
Feb 26 20:26:24 compute-0 sudo[72242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:24 compute-0 python3.9[72245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137583.8256745-335-110314003699539/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:24 compute-0 sudo[72242]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:25 compute-0 sudo[72395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faxitoynpsirffvnvgowehyaqryhziqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137584.9614246-350-43780994012082/AnsiballZ_command.py'
Feb 26 20:26:25 compute-0 sudo[72395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:25 compute-0 python3.9[72398]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:26:25 compute-0 sudo[72395]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:25 compute-0 sudo[72549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghgeioqojawvhpuagkvlnsmbzwynnmea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137585.5438597-358-91678307387267/AnsiballZ_command.py'
Feb 26 20:26:25 compute-0 sudo[72549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:25 compute-0 python3.9[72552]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:26:26 compute-0 sudo[72549]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:26 compute-0 sudo[72703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awimrsjurjfammrnvzdodyuwbvxgzwti ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772137586.1691277-366-257494817410363/AnsiballZ_edpm_nftables_from_files.py'
Feb 26 20:26:26 compute-0 sudo[72703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:26 compute-0 python3[72706]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 26 20:26:26 compute-0 sudo[72703]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:27 compute-0 sudo[72856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnzzknvlrvzxjaqkdkjugqzjxiztvpmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137586.9503415-374-102750329106308/AnsiballZ_stat.py'
Feb 26 20:26:27 compute-0 sudo[72856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:27 compute-0 python3.9[72859]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:27 compute-0 sudo[72856]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:27 compute-0 sudo[72980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjixkjfuowlsfqlwnejodapjupzavago ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137586.9503415-374-102750329106308/AnsiballZ_copy.py'
Feb 26 20:26:27 compute-0 sudo[72980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:28 compute-0 python3.9[72983]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137586.9503415-374-102750329106308/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:28 compute-0 sudo[72980]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:28 compute-0 sudo[73133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwzqjcoxeeugtrzfemxvywrjnvrzwbdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137588.1687486-389-278953147094593/AnsiballZ_stat.py'
Feb 26 20:26:28 compute-0 sudo[73133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:28 compute-0 python3.9[73136]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:28 compute-0 sudo[73133]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:29 compute-0 sudo[73257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tolhgnogpvkkcudguueyjbqozqonsaoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137588.1687486-389-278953147094593/AnsiballZ_copy.py'
Feb 26 20:26:29 compute-0 sudo[73257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:29 compute-0 python3.9[73260]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137588.1687486-389-278953147094593/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:29 compute-0 sudo[73257]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:29 compute-0 sudo[73410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blwadaawmaoytmgfedowpmozppxtrbbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137589.3826737-404-737052491579/AnsiballZ_stat.py'
Feb 26 20:26:29 compute-0 sudo[73410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:29 compute-0 python3.9[73413]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:29 compute-0 sudo[73410]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:30 compute-0 sudo[73534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqsdohgoqrkjwfuzrddlidmnqrhlhekl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137589.3826737-404-737052491579/AnsiballZ_copy.py'
Feb 26 20:26:30 compute-0 sudo[73534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:30 compute-0 python3.9[73537]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137589.3826737-404-737052491579/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:30 compute-0 sudo[73534]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:30 compute-0 sudo[73687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owfootyxsekowrqgftjrxzufsovvuald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137590.4942346-419-193338902798969/AnsiballZ_stat.py'
Feb 26 20:26:30 compute-0 sudo[73687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:30 compute-0 python3.9[73690]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:30 compute-0 sudo[73687]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:31 compute-0 sudo[73811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjonzupnibzbkcyfktndgcjjmgkinryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137590.4942346-419-193338902798969/AnsiballZ_copy.py'
Feb 26 20:26:31 compute-0 sudo[73811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:31 compute-0 python3.9[73814]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137590.4942346-419-193338902798969/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:31 compute-0 sudo[73811]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:31 compute-0 sudo[73964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unyukkacgpqmxdcfynvmcbkeljbwbaof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137591.587904-434-39931757280019/AnsiballZ_stat.py'
Feb 26 20:26:31 compute-0 sudo[73964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:32 compute-0 python3.9[73967]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:26:32 compute-0 sudo[73964]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:32 compute-0 sudo[74088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkbcmpixpeymtlwtxxoasaldqziqbjcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137591.587904-434-39931757280019/AnsiballZ_copy.py'
Feb 26 20:26:32 compute-0 sudo[74088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:32 compute-0 python3.9[74091]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137591.587904-434-39931757280019/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:32 compute-0 sudo[74088]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:32 compute-0 sudo[74241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-horlrnaghawwtrtobaparknxswxxzvqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137592.7557375-449-226161346417367/AnsiballZ_file.py'
Feb 26 20:26:32 compute-0 sudo[74241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:33 compute-0 python3.9[74244]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:33 compute-0 sudo[74241]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:33 compute-0 sudo[74394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfuladxcfhqxcfvraerxpyjltgdtalxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137593.3525455-457-157912637905308/AnsiballZ_command.py'
Feb 26 20:26:33 compute-0 sudo[74394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:33 compute-0 python3.9[74397]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:26:33 compute-0 sudo[74394]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:34 compute-0 sudo[74554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftnubfiddtlwichubezcegdqzpjrvmnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137594.062486-465-63212052722964/AnsiballZ_blockinfile.py'
Feb 26 20:26:34 compute-0 sudo[74554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:35 compute-0 python3.9[74557]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:35 compute-0 sudo[74554]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:35 compute-0 sudo[74708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwonvaobaxvpecpnhdpomvddkubnbtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137595.345691-474-186372759039344/AnsiballZ_file.py'
Feb 26 20:26:35 compute-0 sudo[74708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:35 compute-0 python3.9[74711]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:35 compute-0 sudo[74708]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:36 compute-0 sudo[74861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-holhoawxzrdhfhzyfcqdhkjscszfjfxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137595.9735177-474-48948404237746/AnsiballZ_file.py'
Feb 26 20:26:36 compute-0 sudo[74861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:36 compute-0 python3.9[74864]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:36 compute-0 sudo[74861]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:37 compute-0 sudo[75014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skuusxtfbdlbaucmobopoflvnynvncwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137596.6054077-489-1372212130880/AnsiballZ_mount.py'
Feb 26 20:26:37 compute-0 sudo[75014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:37 compute-0 python3.9[75017]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 26 20:26:37 compute-0 sudo[75014]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:37 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:26:37 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:26:37 compute-0 sudo[75169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayobnnprpbxwtaphfslbhdajipvgokkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137597.4272907-489-26563478651041/AnsiballZ_mount.py'
Feb 26 20:26:37 compute-0 sudo[75169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:37 compute-0 python3.9[75172]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 26 20:26:37 compute-0 sudo[75169]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:38 compute-0 sshd-session[65910]: Connection closed by 192.168.122.30 port 48064
Feb 26 20:26:38 compute-0 sshd-session[65907]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:26:38 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 26 20:26:38 compute-0 systemd-logind[825]: Session 14 logged out. Waiting for processes to exit.
Feb 26 20:26:38 compute-0 systemd[1]: session-14.scope: Consumed 32.925s CPU time.
Feb 26 20:26:38 compute-0 systemd-logind[825]: Removed session 14.
Feb 26 20:26:43 compute-0 sshd-session[75198]: Accepted publickey for zuul from 192.168.122.30 port 56090 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:26:43 compute-0 systemd-logind[825]: New session 15 of user zuul.
Feb 26 20:26:43 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 26 20:26:43 compute-0 sshd-session[75198]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:26:43 compute-0 sudo[75351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eounmbbdiklmxmmjcovctrdfyqlzfmjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137603.5358257-16-14076995616677/AnsiballZ_tempfile.py'
Feb 26 20:26:43 compute-0 sudo[75351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:44 compute-0 python3.9[75354]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 26 20:26:44 compute-0 sudo[75351]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:44 compute-0 sudo[75504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meqylwrpcpffrpurawwssudgbtjamcop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137604.35383-28-274848235398769/AnsiballZ_stat.py'
Feb 26 20:26:44 compute-0 sudo[75504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:44 compute-0 python3.9[75507]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:26:44 compute-0 sudo[75504]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:45 compute-0 sudo[75657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oibptnjljmgotgasieuozwmxlsjzezxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137605.1455781-38-276984423866197/AnsiballZ_setup.py'
Feb 26 20:26:45 compute-0 sudo[75657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:46 compute-0 python3.9[75660]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:26:46 compute-0 sudo[75657]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:46 compute-0 sudo[75810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibjjbnwgjcbvxralwhpwwahauaxnenez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137606.3027186-47-280268973841432/AnsiballZ_blockinfile.py'
Feb 26 20:26:46 compute-0 sudo[75810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:46 compute-0 python3.9[75813]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzHp2I7/DncQkDLKg8xyGj/7tXOpRhU7XPlJDWfNRyOZHtp9a0cYQcLB+P5il7gQ5cbgXMpl/BjspuFU4sdG/pCqoG8XabexP8JO0ystqH7oSgRklR9PzhMcG5b1QyiKTot7lsoJCtLANjMIL05iTBEmJUhN61+HOcOpbacRHg7NemR46hJ7b9tbLYB8Aq8tg9YePq4skZEKUhquBAGRGxpibPd8KvtKkUw2o3+qlgAARMkJ1iOZBgHgboMoEHfAh+MTg1paBG8C2JxTlwH21eS2x8BmY2nnzGAeirx+KUBZcURtdLpALI9UOLWw1uN6fBp/QIqdPdc5Eh+3H4LzUfcyNPrPhGKpw4FKjJL76UEClV33BGPYK+/3cG8Dwz22PLg1tCuNpZaeb4u4Z5jj36UocWV2+c6+Nj6cBWtx/9cje/DWA/nxfMQri1ujKGExWMAHIXbvVPwElgiSjeVQQKYNd8ZHU/f5mVaCFNOnsa9pKAWVuqTZ4EoPUcFsHNyU8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKT/GomfxotbuRzU/v+77oVVNchryXkFnKoSk/K8p8ub
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9mAg5mPgk8b35BfD2JsKn/PtrcnyDNoQ7vVXsvSDeNKdQxtOraD/wju5+EbJ5Y6I1xo53E4qDM4C5ogeE3pzs=
                                             create=True mode=0644 path=/tmp/ansible.3jv8ca27 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:46 compute-0 sudo[75810]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:47 compute-0 sudo[75963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-medhsyakghswyqithceyezwqrufhqpro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137607.1406515-55-274514366269713/AnsiballZ_command.py'
Feb 26 20:26:47 compute-0 sudo[75963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:47 compute-0 python3.9[75966]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.3jv8ca27' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:26:47 compute-0 sudo[75963]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:48 compute-0 sudo[76118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeyojsphxarwaurfqslzopchujmmmnoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137607.8153791-63-196153143806913/AnsiballZ_file.py'
Feb 26 20:26:48 compute-0 sudo[76118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:48 compute-0 python3.9[76121]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.3jv8ca27 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:26:48 compute-0 sudo[76118]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:48 compute-0 sshd-session[75201]: Connection closed by 192.168.122.30 port 56090
Feb 26 20:26:48 compute-0 sshd-session[75198]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:26:48 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 26 20:26:48 compute-0 systemd[1]: session-15.scope: Consumed 3.192s CPU time.
Feb 26 20:26:48 compute-0 systemd-logind[825]: Session 15 logged out. Waiting for processes to exit.
Feb 26 20:26:48 compute-0 systemd-logind[825]: Removed session 15.
Feb 26 20:26:50 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 26 20:26:54 compute-0 sshd-session[76148]: Accepted publickey for zuul from 192.168.122.30 port 41998 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:26:54 compute-0 systemd-logind[825]: New session 16 of user zuul.
Feb 26 20:26:54 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 26 20:26:54 compute-0 sshd-session[76148]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:26:55 compute-0 python3.9[76301]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:26:56 compute-0 sudo[76455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agcaxlbzphcqzwfrbaqlzclcfkenvuaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137615.8475714-27-43589566143985/AnsiballZ_systemd.py'
Feb 26 20:26:56 compute-0 sudo[76455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:56 compute-0 python3.9[76458]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 26 20:26:56 compute-0 sudo[76455]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:57 compute-0 sudo[76610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaulvwvnnnxedhuuzmrfwndsvitsrdbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137616.883651-35-69355525837962/AnsiballZ_systemd.py'
Feb 26 20:26:57 compute-0 sudo[76610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:57 compute-0 python3.9[76613]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:26:58 compute-0 sudo[76610]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:59 compute-0 sudo[76764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cenqmvkrsnolctkvqzvrmpvqteprmguo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137618.707786-44-176518827213732/AnsiballZ_command.py'
Feb 26 20:26:59 compute-0 sudo[76764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:26:59 compute-0 python3.9[76767]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:26:59 compute-0 sudo[76764]: pam_unix(sudo:session): session closed for user root
Feb 26 20:26:59 compute-0 sudo[76918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbcbaanmbyozubypcujqshrpmojxcnoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137619.467399-52-42156614031711/AnsiballZ_stat.py'
Feb 26 20:26:59 compute-0 sudo[76918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:00 compute-0 python3.9[76921]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:27:00 compute-0 sudo[76918]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:00 compute-0 sudo[77073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjolbosmsqeldprbldnjyyipygbvjelv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137620.2407815-60-48815835344974/AnsiballZ_command.py'
Feb 26 20:27:00 compute-0 sudo[77073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:00 compute-0 python3.9[77076]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:27:00 compute-0 sudo[77073]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:01 compute-0 sudo[77229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vittkifqbuohuxbdeneyoijzaqjvynwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137620.9052982-68-149515658107459/AnsiballZ_file.py'
Feb 26 20:27:01 compute-0 sudo[77229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:02 compute-0 python3.9[77232]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:02 compute-0 sudo[77229]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:02 compute-0 sshd-session[76151]: Connection closed by 192.168.122.30 port 41998
Feb 26 20:27:02 compute-0 sshd-session[76148]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:27:02 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 26 20:27:02 compute-0 systemd[1]: session-16.scope: Consumed 4.234s CPU time.
Feb 26 20:27:02 compute-0 systemd-logind[825]: Session 16 logged out. Waiting for processes to exit.
Feb 26 20:27:02 compute-0 systemd-logind[825]: Removed session 16.
Feb 26 20:27:07 compute-0 sshd-session[77257]: Accepted publickey for zuul from 192.168.122.30 port 44892 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:27:07 compute-0 systemd-logind[825]: New session 17 of user zuul.
Feb 26 20:27:07 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 26 20:27:07 compute-0 sshd-session[77257]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:27:08 compute-0 python3.9[77410]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:27:09 compute-0 sudo[77564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywogszbqzeuvbwgihebrrroseowbendm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137628.8593142-29-276218531492471/AnsiballZ_setup.py'
Feb 26 20:27:09 compute-0 sudo[77564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:09 compute-0 python3.9[77567]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:27:09 compute-0 sudo[77564]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:10 compute-0 sudo[77649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfmizumifpswsrwxqrsbhmjflysptzqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137628.8593142-29-276218531492471/AnsiballZ_dnf.py'
Feb 26 20:27:10 compute-0 sudo[77649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:10 compute-0 python3.9[77652]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 26 20:27:11 compute-0 sudo[77649]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:12 compute-0 python3.9[77803]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:27:13 compute-0 python3.9[77954]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 26 20:27:14 compute-0 python3.9[78104]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:27:14 compute-0 python3.9[78254]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:27:15 compute-0 sshd-session[77260]: Connection closed by 192.168.122.30 port 44892
Feb 26 20:27:15 compute-0 sshd-session[77257]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:27:15 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 26 20:27:15 compute-0 systemd[1]: session-17.scope: Consumed 5.396s CPU time.
Feb 26 20:27:15 compute-0 systemd-logind[825]: Session 17 logged out. Waiting for processes to exit.
Feb 26 20:27:15 compute-0 systemd-logind[825]: Removed session 17.
Feb 26 20:27:20 compute-0 sshd-session[78279]: Accepted publickey for zuul from 192.168.122.30 port 46492 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:27:20 compute-0 systemd-logind[825]: New session 18 of user zuul.
Feb 26 20:27:20 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 26 20:27:20 compute-0 sshd-session[78279]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:27:21 compute-0 python3.9[78432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:27:23 compute-0 sudo[78586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbpsvkyvosfpclaoqmbskljldyiwhwaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137642.6817353-45-95242143011936/AnsiballZ_file.py'
Feb 26 20:27:23 compute-0 sudo[78586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:23 compute-0 python3.9[78589]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:23 compute-0 sudo[78586]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:23 compute-0 sudo[78739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyfpqiynplfmcuatkchisypnaewnflvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137643.3685052-45-144993702108881/AnsiballZ_file.py'
Feb 26 20:27:23 compute-0 sudo[78739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:23 compute-0 python3.9[78742]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:23 compute-0 sudo[78739]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:24 compute-0 sudo[78892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thegbgeybfcvbeepibwmgvvvarniblye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137643.8838277-60-72175680602111/AnsiballZ_stat.py'
Feb 26 20:27:24 compute-0 sudo[78892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:24 compute-0 python3.9[78895]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:24 compute-0 sudo[78892]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:24 compute-0 sudo[79016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dljydidtpgbrvjrwednxyioxvklhuiao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137643.8838277-60-72175680602111/AnsiballZ_copy.py'
Feb 26 20:27:24 compute-0 sudo[79016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:25 compute-0 python3.9[79019]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137643.8838277-60-72175680602111/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=0d87b9d3e4e732c5c999f6f6a93ce2ada202550d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:25 compute-0 sudo[79016]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:25 compute-0 sudo[79169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlnvzmxodgcikdrlzgmyfehcwaymkcpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137645.241229-60-222009914108790/AnsiballZ_stat.py'
Feb 26 20:27:25 compute-0 sudo[79169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:25 compute-0 python3.9[79172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:25 compute-0 sudo[79169]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:26 compute-0 sudo[79293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdhtzfghfgerfsfmhozddkpluqdgmxxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137645.241229-60-222009914108790/AnsiballZ_copy.py'
Feb 26 20:27:26 compute-0 sudo[79293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:26 compute-0 python3.9[79296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137645.241229-60-222009914108790/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=053dd63fe6b3e6abd86c253156025327312512bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:26 compute-0 sudo[79293]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:26 compute-0 sudo[79446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cveookqcptahxjqrzuflxekvshymcjfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137646.3507197-60-89073603818115/AnsiballZ_stat.py'
Feb 26 20:27:26 compute-0 sudo[79446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:26 compute-0 python3.9[79449]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:26 compute-0 sudo[79446]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:27 compute-0 sudo[79570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibsooyinoqchihoujehbxwjbdkkybihu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137646.3507197-60-89073603818115/AnsiballZ_copy.py'
Feb 26 20:27:27 compute-0 sudo[79570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:27 compute-0 python3.9[79573]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137646.3507197-60-89073603818115/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=83f5f2dc4d2d4097a2f92ff2dfac2d68db009c1c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:27 compute-0 sudo[79570]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:27 compute-0 sudo[79723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pktupavnlybjkezmejyzwahoytbynarz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137647.460729-104-169112361265269/AnsiballZ_file.py'
Feb 26 20:27:27 compute-0 sudo[79723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:27 compute-0 python3.9[79726]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:28 compute-0 sudo[79723]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:28 compute-0 sudo[79876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtwozziijbvqpilcufnkfpnqczeewgdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137648.1665404-104-122391159652901/AnsiballZ_file.py'
Feb 26 20:27:28 compute-0 sudo[79876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:28 compute-0 python3.9[79879]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:28 compute-0 sudo[79876]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:28 compute-0 sudo[80029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqmjwjepcigujhrbgxnvucwknieezsja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137648.7434998-119-88023694142327/AnsiballZ_stat.py'
Feb 26 20:27:28 compute-0 sudo[80029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:29 compute-0 python3.9[80032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:29 compute-0 sudo[80029]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:29 compute-0 sudo[80153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsavgcfwamykkbyqbmakmfzewqywemir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137648.7434998-119-88023694142327/AnsiballZ_copy.py'
Feb 26 20:27:29 compute-0 sudo[80153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:29 compute-0 python3.9[80156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137648.7434998-119-88023694142327/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=20abf8f6566ca756584249817a6f52f781487a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:29 compute-0 sudo[80153]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:30 compute-0 sudo[80306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlywkwelzjkretijjmgaktuuhwsipxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137649.7962468-119-100635689921756/AnsiballZ_stat.py'
Feb 26 20:27:30 compute-0 sudo[80306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:30 compute-0 python3.9[80309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:30 compute-0 sudo[80306]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:30 compute-0 sudo[80430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxxbyasyamdndaifwvhrimldrsfldgwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137649.7962468-119-100635689921756/AnsiballZ_copy.py'
Feb 26 20:27:30 compute-0 sudo[80430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:30 compute-0 python3.9[80433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137649.7962468-119-100635689921756/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=417fea1d7eaa12505c48d0759c50949f61c454b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:30 compute-0 sudo[80430]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:30 compute-0 sudo[80583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umkfcvwmkddiuiyqracbqotigyhegjxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137650.7374914-119-48804790478420/AnsiballZ_stat.py'
Feb 26 20:27:30 compute-0 sudo[80583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:31 compute-0 python3.9[80586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:31 compute-0 sudo[80583]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:31 compute-0 sudo[80707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejxfmryntfrbkzyqfpxlqemigielglft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137650.7374914-119-48804790478420/AnsiballZ_copy.py'
Feb 26 20:27:31 compute-0 sudo[80707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:31 compute-0 python3.9[80710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137650.7374914-119-48804790478420/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=eeb7df450e7beeef74e340be1ec555f3ff66c746 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:31 compute-0 sudo[80707]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:31 compute-0 sudo[80860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wspwwlmcrfpitsvcbtkxymtzxxaogsjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137651.7256484-163-105799766967297/AnsiballZ_file.py'
Feb 26 20:27:31 compute-0 sudo[80860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:32 compute-0 python3.9[80863]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:32 compute-0 sudo[80860]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:32 compute-0 sudo[81013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxwtehnlwwgrycsrwijawofdrzcdkrbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137652.240418-163-48068782995603/AnsiballZ_file.py'
Feb 26 20:27:32 compute-0 sudo[81013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:32 compute-0 python3.9[81016]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:32 compute-0 sudo[81013]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:33 compute-0 sudo[81166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anfkxsjcadwwpgasifuvyqrlwoxjmcut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137652.8454392-178-235149918503945/AnsiballZ_stat.py'
Feb 26 20:27:33 compute-0 sudo[81166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:33 compute-0 python3.9[81169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:33 compute-0 sudo[81166]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:33 compute-0 sudo[81290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vojzawabewpxdkfhsopwwtlbpizqjwfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137652.8454392-178-235149918503945/AnsiballZ_copy.py'
Feb 26 20:27:33 compute-0 sudo[81290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:33 compute-0 python3.9[81293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137652.8454392-178-235149918503945/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e8ca17ffd3f283eb74a785dbea080fc27ec3c54f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:33 compute-0 sudo[81290]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:34 compute-0 sudo[81443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hasljsqtksewuhjckynywsyrbzvokonl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137653.8612027-178-237993423846898/AnsiballZ_stat.py'
Feb 26 20:27:34 compute-0 sudo[81443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:34 compute-0 python3.9[81446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:34 compute-0 sudo[81443]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:34 compute-0 sudo[81567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anmfzbfhotoqlnipdrjqqwbbywdhpmdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137653.8612027-178-237993423846898/AnsiballZ_copy.py'
Feb 26 20:27:34 compute-0 sudo[81567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:34 compute-0 python3.9[81570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137653.8612027-178-237993423846898/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f15f3729758aa87d7c3bbd28575bf8a1e045bec3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:34 compute-0 sudo[81567]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:35 compute-0 sudo[81720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlkmvxdppnwycklvsuiqpgljqgbdmzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137654.8095405-178-171206848330833/AnsiballZ_stat.py'
Feb 26 20:27:35 compute-0 sudo[81720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:35 compute-0 python3.9[81723]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:35 compute-0 sudo[81720]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:35 compute-0 sudo[81844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcrzhltfkwldebflfddkmtzwvfonvgui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137654.8095405-178-171206848330833/AnsiballZ_copy.py'
Feb 26 20:27:35 compute-0 sudo[81844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:35 compute-0 python3.9[81847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137654.8095405-178-171206848330833/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=2bbd69d16cd419213569ac20e2b489451b7690ca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:35 compute-0 sudo[81844]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:36 compute-0 sudo[81997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqmqjydufywaesxdwgqjhjtcmwwtjcvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137655.891556-222-181325062509262/AnsiballZ_file.py'
Feb 26 20:27:36 compute-0 sudo[81997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:36 compute-0 python3.9[82000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:36 compute-0 sudo[81997]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:36 compute-0 sudo[82150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brrncenkbzthzqeszibtkegfpslyhmki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137656.4014304-222-117427396873243/AnsiballZ_file.py'
Feb 26 20:27:36 compute-0 sudo[82150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:36 compute-0 python3.9[82153]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:36 compute-0 sudo[82150]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:37 compute-0 sudo[82303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yailmopskpjooywbnbcuhgepomfwltos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137656.9677615-237-202825279993142/AnsiballZ_stat.py'
Feb 26 20:27:37 compute-0 sudo[82303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:37 compute-0 python3.9[82306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:37 compute-0 sudo[82303]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:37 compute-0 sudo[82427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csdwiritpytwtkwriymqmtavwmurkcnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137656.9677615-237-202825279993142/AnsiballZ_copy.py'
Feb 26 20:27:37 compute-0 sudo[82427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:37 compute-0 python3.9[82430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137656.9677615-237-202825279993142/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=875ba77b4d413af1e3f8bac0c1a4aa3fbc2fac1c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:37 compute-0 sudo[82427]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:38 compute-0 sudo[82580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgqufdhpjdekfddqxbcausbpxojwvprj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137658.0575125-237-261681448590153/AnsiballZ_stat.py'
Feb 26 20:27:38 compute-0 sudo[82580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:38 compute-0 python3.9[82583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:38 compute-0 sudo[82580]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:38 compute-0 sudo[82704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bceizfdrcryrmopxiveumyzettqebkhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137658.0575125-237-261681448590153/AnsiballZ_copy.py'
Feb 26 20:27:38 compute-0 sudo[82704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:38 compute-0 python3.9[82707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137658.0575125-237-261681448590153/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f15f3729758aa87d7c3bbd28575bf8a1e045bec3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:39 compute-0 sudo[82704]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:39 compute-0 sudo[82857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsubuqhnjopqodifwtgxqobpkwfjstwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137659.113372-237-16945548970860/AnsiballZ_stat.py'
Feb 26 20:27:39 compute-0 sudo[82857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:39 compute-0 python3.9[82860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:39 compute-0 sudo[82857]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:39 compute-0 sudo[82981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlcughcmkxugivswhrhnlufugklrhusq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137659.113372-237-16945548970860/AnsiballZ_copy.py'
Feb 26 20:27:39 compute-0 sudo[82981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:39 compute-0 python3.9[82984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137659.113372-237-16945548970860/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a51ba86223dc444d933b0ebc9432263b86917153 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:39 compute-0 sudo[82981]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:40 compute-0 sudo[83134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiaijumpsayywyauehjhgdcdyinikcij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137660.64727-297-15300874048581/AnsiballZ_file.py'
Feb 26 20:27:40 compute-0 sudo[83134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:41 compute-0 python3.9[83137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:41 compute-0 sudo[83134]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:41 compute-0 sudo[83287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmpcgdeucfxbanggqgqgrkcktoudamhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137661.2665043-305-9402902062875/AnsiballZ_stat.py'
Feb 26 20:27:41 compute-0 sudo[83287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:41 compute-0 python3.9[83290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:41 compute-0 sudo[83287]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:42 compute-0 sudo[83411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnikileejdntxssuwzmlnbgrrhwmijye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137661.2665043-305-9402902062875/AnsiballZ_copy.py'
Feb 26 20:27:42 compute-0 sudo[83411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:42 compute-0 python3.9[83414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137661.2665043-305-9402902062875/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:42 compute-0 sudo[83411]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:42 compute-0 sudo[83564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbifycfjbwjzgowzylcajhanxlgmibwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137662.5901103-321-81235490186963/AnsiballZ_file.py'
Feb 26 20:27:42 compute-0 sudo[83564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:43 compute-0 python3.9[83567]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:43 compute-0 sudo[83564]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:43 compute-0 sudo[83717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxqusnqxbtlypvjckayvyczonmqglkna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137663.2303329-329-271986440638872/AnsiballZ_stat.py'
Feb 26 20:27:43 compute-0 sudo[83717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:43 compute-0 python3.9[83720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:43 compute-0 sudo[83717]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:43 compute-0 sudo[83841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgfzhzcqbsorhunvhxetesjlprdvldrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137663.2303329-329-271986440638872/AnsiballZ_copy.py'
Feb 26 20:27:43 compute-0 sudo[83841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:44 compute-0 python3.9[83844]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137663.2303329-329-271986440638872/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:44 compute-0 sudo[83841]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:44 compute-0 sudo[83994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pceytrnfqnnrnlyxsgrjmsbnhtyggmzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137664.386303-345-54741554825837/AnsiballZ_file.py'
Feb 26 20:27:44 compute-0 sudo[83994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:44 compute-0 python3.9[83997]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:44 compute-0 sudo[83994]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:45 compute-0 sudo[84147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnewqihnmsxatizzwjfkuefzpbijnomy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137665.0955372-353-31306209344474/AnsiballZ_stat.py'
Feb 26 20:27:45 compute-0 sudo[84147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:45 compute-0 python3.9[84150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:45 compute-0 sudo[84147]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:45 compute-0 sudo[84271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqbfvcgtqxerjnfalozkvitpywtemzpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137665.0955372-353-31306209344474/AnsiballZ_copy.py'
Feb 26 20:27:45 compute-0 sudo[84271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:46 compute-0 python3.9[84274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137665.0955372-353-31306209344474/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:46 compute-0 sudo[84271]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:46 compute-0 sudo[84424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odwdwbuxrljpniaacdsxpkibwsbuwkfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137666.4461136-369-69807209223224/AnsiballZ_file.py'
Feb 26 20:27:46 compute-0 sudo[84424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:46 compute-0 python3.9[84427]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:46 compute-0 sudo[84424]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:47 compute-0 sudo[84577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lobjdrntsusdeklipotueskowntgzggc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137667.0124686-377-184601854302164/AnsiballZ_stat.py'
Feb 26 20:27:47 compute-0 sudo[84577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:47 compute-0 python3.9[84580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:47 compute-0 sudo[84577]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:47 compute-0 sudo[84701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtdxwjvjlamjzcxntbymnleuyqrtnaem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137667.0124686-377-184601854302164/AnsiballZ_copy.py'
Feb 26 20:27:47 compute-0 sudo[84701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:47 compute-0 python3.9[84704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137667.0124686-377-184601854302164/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:48 compute-0 sudo[84701]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:48 compute-0 sudo[84854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfbqjgmvntqagbquqxvvzgwnfqytxuum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137668.205384-393-59945523954720/AnsiballZ_file.py'
Feb 26 20:27:48 compute-0 sudo[84854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:48 compute-0 python3.9[84857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:48 compute-0 sudo[84854]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:49 compute-0 sudo[85007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbnbafnpctecjyucwsxpfcgbfumbtxjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137668.8372352-401-129695930354309/AnsiballZ_stat.py'
Feb 26 20:27:49 compute-0 sudo[85007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:49 compute-0 python3.9[85010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:49 compute-0 sudo[85007]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:49 compute-0 sudo[85131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkovvvgkwrudqhhcegbcfikanlqbppbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137668.8372352-401-129695930354309/AnsiballZ_copy.py'
Feb 26 20:27:49 compute-0 sudo[85131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:49 compute-0 python3.9[85134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137668.8372352-401-129695930354309/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:49 compute-0 sudo[85131]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:50 compute-0 sudo[85284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frovjxdqazswreojnrtduxtklputcnza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137670.1208668-417-212953653312932/AnsiballZ_file.py'
Feb 26 20:27:50 compute-0 sudo[85284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:50 compute-0 python3.9[85287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:50 compute-0 sudo[85284]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:51 compute-0 sudo[85437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrvexjvqqdrurkxckfsyaqhtpvdvpbdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137670.8496482-425-233462114209145/AnsiballZ_stat.py'
Feb 26 20:27:51 compute-0 sudo[85437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:51 compute-0 python3.9[85440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:51 compute-0 sudo[85437]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:51 compute-0 sudo[85561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjtqefgvfmmddmlalbsdsvoaarhwedsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137670.8496482-425-233462114209145/AnsiballZ_copy.py'
Feb 26 20:27:51 compute-0 sudo[85561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:51 compute-0 python3.9[85564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137670.8496482-425-233462114209145/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:51 compute-0 sudo[85561]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:52 compute-0 sudo[85714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkyvhozruobtnhumpqpffjydimygdftr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137672.0255463-441-206654485059857/AnsiballZ_file.py'
Feb 26 20:27:52 compute-0 sudo[85714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:52 compute-0 python3.9[85717]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:27:52 compute-0 sudo[85714]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:52 compute-0 sudo[85867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmlvszqkltfjigbeqggccjozxcsgiohw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137672.570795-449-189643745732032/AnsiballZ_stat.py'
Feb 26 20:27:52 compute-0 sudo[85867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:53 compute-0 python3.9[85870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:27:53 compute-0 sudo[85867]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:53 compute-0 sudo[85991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bytyzaamhxxzkbnfzuinpkqopsikhnbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137672.570795-449-189643745732032/AnsiballZ_copy.py'
Feb 26 20:27:53 compute-0 sudo[85991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:27:53 compute-0 python3.9[85994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137672.570795-449-189643745732032/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=853e5e1f50ff9ad65bb5b0720c7733e0aa47f6bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:27:53 compute-0 sudo[85991]: pam_unix(sudo:session): session closed for user root
Feb 26 20:27:53 compute-0 sshd-session[78282]: Connection closed by 192.168.122.30 port 46492
Feb 26 20:27:53 compute-0 sshd-session[78279]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:27:53 compute-0 systemd-logind[825]: Session 18 logged out. Waiting for processes to exit.
Feb 26 20:27:53 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 26 20:27:53 compute-0 systemd[1]: session-18.scope: Consumed 25.457s CPU time.
Feb 26 20:27:53 compute-0 systemd-logind[825]: Removed session 18.
Feb 26 20:27:57 compute-0 chronyd[65881]: Selected source 209.227.173.244 (pool.ntp.org)
Feb 26 20:27:59 compute-0 sshd-session[86019]: Accepted publickey for zuul from 192.168.122.30 port 52328 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:27:59 compute-0 systemd-logind[825]: New session 19 of user zuul.
Feb 26 20:27:59 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 26 20:27:59 compute-0 sshd-session[86019]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:28:00 compute-0 python3.9[86172]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:28:00 compute-0 sudo[86326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frqgytlsnpfjrassbjkflturhihvjupn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137680.597241-29-107287895261136/AnsiballZ_file.py'
Feb 26 20:28:00 compute-0 sudo[86326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:01 compute-0 python3.9[86329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:01 compute-0 sudo[86326]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:01 compute-0 sudo[86479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drgkogkywwqllbfxdlvvhubyseclorhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137681.3193882-29-69917383055244/AnsiballZ_file.py'
Feb 26 20:28:01 compute-0 sudo[86479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:01 compute-0 python3.9[86482]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:01 compute-0 sudo[86479]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:02 compute-0 python3.9[86632]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:28:02 compute-0 sudo[86782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afycpdcpcbwvvzwvbekqyhhjnktjjzrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137682.6295626-52-122232565537947/AnsiballZ_seboolean.py'
Feb 26 20:28:02 compute-0 sudo[86782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:03 compute-0 python3.9[86785]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 26 20:28:04 compute-0 sudo[86782]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:04 compute-0 sudo[86939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgltvspnnfcxciggphkcjojzoxtqtine ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137684.5070016-62-172579747581382/AnsiballZ_setup.py'
Feb 26 20:28:04 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 26 20:28:04 compute-0 sudo[86939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:05 compute-0 python3.9[86942]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:28:05 compute-0 sudo[86939]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:05 compute-0 sudo[87024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhrkzjaucrqbwwszqgtwtotdgdcsirvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137684.5070016-62-172579747581382/AnsiballZ_dnf.py'
Feb 26 20:28:05 compute-0 sudo[87024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:05 compute-0 python3.9[87027]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:28:07 compute-0 sudo[87024]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:07 compute-0 sudo[87178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxvmbabwxuztnejkazdbwdwuwnsxdpri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137687.2159863-74-101468723216437/AnsiballZ_systemd.py'
Feb 26 20:28:07 compute-0 sudo[87178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:07 compute-0 python3.9[87181]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:28:08 compute-0 sudo[87178]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:08 compute-0 sudo[87334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orbveroidnasfqyyvfgnolghtssoeuqi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772137688.2128034-82-119067195136189/AnsiballZ_edpm_nftables_snippet.py'
Feb 26 20:28:08 compute-0 sudo[87334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:08 compute-0 python3[87337]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 26 20:28:08 compute-0 sudo[87334]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:09 compute-0 sudo[87487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llnabikzzbhweluliorhyapcqdwwijai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137689.0219157-91-39555167363682/AnsiballZ_file.py'
Feb 26 20:28:09 compute-0 sudo[87487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:09 compute-0 python3.9[87490]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:09 compute-0 sudo[87487]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:09 compute-0 sudo[87640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkvmujuzxejzepolycmarthzytvkmqia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137689.6061597-99-196014618066235/AnsiballZ_stat.py'
Feb 26 20:28:09 compute-0 sudo[87640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:10 compute-0 python3.9[87643]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:10 compute-0 sudo[87640]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:10 compute-0 sudo[87719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjqwlscwixmjxtuobjcdydvkxnogbaqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137689.6061597-99-196014618066235/AnsiballZ_file.py'
Feb 26 20:28:10 compute-0 sudo[87719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:10 compute-0 python3.9[87722]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:10 compute-0 sudo[87719]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:11 compute-0 sudo[87872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blrpmaiexjmnahsswjyysavujeogpqpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137690.7988508-111-88286649880496/AnsiballZ_stat.py'
Feb 26 20:28:11 compute-0 sudo[87872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:11 compute-0 python3.9[87875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:11 compute-0 sudo[87872]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:11 compute-0 sudo[87951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vakxugvjebwviizejkufjteafiytnniy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137690.7988508-111-88286649880496/AnsiballZ_file.py'
Feb 26 20:28:11 compute-0 sudo[87951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:11 compute-0 python3.9[87954]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.n1we61t2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:11 compute-0 sudo[87951]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:12 compute-0 sudo[88104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbpkohlixivzyrhtloaqxpojiggahgvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137691.9912906-123-129578928435114/AnsiballZ_stat.py'
Feb 26 20:28:12 compute-0 sudo[88104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:12 compute-0 python3.9[88107]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:12 compute-0 sudo[88104]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:12 compute-0 sudo[88183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejvjqjsnpwvyzcnggptzvxwbrisermgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137691.9912906-123-129578928435114/AnsiballZ_file.py'
Feb 26 20:28:12 compute-0 sudo[88183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:12 compute-0 python3.9[88186]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:12 compute-0 sudo[88183]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:13 compute-0 sudo[88336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plmayhlrmyvemzyerkwccnutvcmevcde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137693.046463-136-250494754365332/AnsiballZ_command.py'
Feb 26 20:28:13 compute-0 sudo[88336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:13 compute-0 python3.9[88339]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:13 compute-0 sudo[88336]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:14 compute-0 sudo[88490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqxjldudcccfqxsvhikhhajaahpabxyd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772137693.767406-144-67869522382083/AnsiballZ_edpm_nftables_from_files.py'
Feb 26 20:28:14 compute-0 sudo[88490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:14 compute-0 python3[88493]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 26 20:28:14 compute-0 sudo[88490]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:14 compute-0 sudo[88643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-astywvtfqfcnangmroibiupagcawajps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137694.5550065-152-68652660989437/AnsiballZ_stat.py'
Feb 26 20:28:14 compute-0 sudo[88643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:14 compute-0 python3.9[88646]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:15 compute-0 sudo[88643]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:15 compute-0 sudo[88769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmksjitduoausoglyvrsytzhlgpasgtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137694.5550065-152-68652660989437/AnsiballZ_copy.py'
Feb 26 20:28:15 compute-0 sudo[88769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:15 compute-0 python3.9[88772]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137694.5550065-152-68652660989437/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:15 compute-0 sudo[88769]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:16 compute-0 sudo[88922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppkdopxhvgqsafnkmoqhcepzkubnckon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137695.790472-167-10792842676745/AnsiballZ_stat.py'
Feb 26 20:28:16 compute-0 sudo[88922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:16 compute-0 python3.9[88925]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:16 compute-0 sudo[88922]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:16 compute-0 sudo[89048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlwkxmxebdaramcknbtajtkkhqokpimr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137695.790472-167-10792842676745/AnsiballZ_copy.py'
Feb 26 20:28:16 compute-0 sudo[89048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:16 compute-0 python3.9[89051]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137695.790472-167-10792842676745/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:16 compute-0 sudo[89048]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:17 compute-0 sudo[89201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbcgawnocxlxwojrmmzqjybneiasquse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137697.0569057-182-3712531757733/AnsiballZ_stat.py'
Feb 26 20:28:17 compute-0 sudo[89201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:17 compute-0 python3.9[89204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:17 compute-0 sudo[89201]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:17 compute-0 sudo[89327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ougpszkgahvhmoupusmebjaffedjnfju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137697.0569057-182-3712531757733/AnsiballZ_copy.py'
Feb 26 20:28:17 compute-0 sudo[89327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:18 compute-0 python3.9[89330]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137697.0569057-182-3712531757733/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:18 compute-0 sudo[89327]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:18 compute-0 sudo[89480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okcwrxhztjurhzlqkcjsohilqymrqpxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137698.2431743-197-45865507641571/AnsiballZ_stat.py'
Feb 26 20:28:18 compute-0 sudo[89480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:18 compute-0 python3.9[89483]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:18 compute-0 sudo[89480]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:18 compute-0 sudo[89606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pglvywrdjitfgvvqchstxyweuixcluhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137698.2431743-197-45865507641571/AnsiballZ_copy.py'
Feb 26 20:28:18 compute-0 sudo[89606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:19 compute-0 python3.9[89609]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137698.2431743-197-45865507641571/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:19 compute-0 sudo[89606]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:19 compute-0 sudo[89759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heswjfefzowfiqqlmjlilflfhstnvtxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137699.28961-212-139791183795823/AnsiballZ_stat.py'
Feb 26 20:28:19 compute-0 sudo[89759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:19 compute-0 python3.9[89762]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:19 compute-0 sudo[89759]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:20 compute-0 sudo[89885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyusjyoafarrrxipgmmltuqrbzssfcfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137699.28961-212-139791183795823/AnsiballZ_copy.py'
Feb 26 20:28:20 compute-0 sudo[89885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:20 compute-0 python3.9[89888]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137699.28961-212-139791183795823/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:20 compute-0 sudo[89885]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:20 compute-0 sudo[90038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhgjyopfnponfinjodwsugfhbsvaqifw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137700.5610263-227-98616384563938/AnsiballZ_file.py'
Feb 26 20:28:20 compute-0 sudo[90038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:21 compute-0 python3.9[90041]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:21 compute-0 sudo[90038]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:21 compute-0 sudo[90191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twgpurgrzpuzmthlhzkccbtxaenrlwxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137701.2573369-235-970735267828/AnsiballZ_command.py'
Feb 26 20:28:21 compute-0 sudo[90191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:21 compute-0 python3.9[90194]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:21 compute-0 sudo[90191]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:22 compute-0 sudo[90347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbzigpanvqffiqxfqpuabeknxsndinfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137701.8126035-243-237955266030568/AnsiballZ_blockinfile.py'
Feb 26 20:28:22 compute-0 sudo[90347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:22 compute-0 python3.9[90350]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:22 compute-0 sudo[90347]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:22 compute-0 sudo[90500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcujratipyghmbavpmimirksmbjtipjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137702.5855544-252-207851901161110/AnsiballZ_command.py'
Feb 26 20:28:22 compute-0 sudo[90500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:22 compute-0 python3.9[90503]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:22 compute-0 sudo[90500]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:23 compute-0 sudo[90654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbzypyjicdrpteujxzyruzwfriformyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137703.1267-260-236118956082525/AnsiballZ_stat.py'
Feb 26 20:28:23 compute-0 sudo[90654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:23 compute-0 python3.9[90657]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:28:23 compute-0 sudo[90654]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:23 compute-0 sudo[90809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xklzyrsgxecqzmtmkjhkjinmrbjkgyty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137703.6787379-268-103318545947501/AnsiballZ_command.py'
Feb 26 20:28:23 compute-0 sudo[90809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:24 compute-0 python3.9[90812]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:24 compute-0 sudo[90809]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:24 compute-0 sudo[90965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwmrbvcqwzccmwsaeujatbmegckmstyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137704.2380857-276-13225890234626/AnsiballZ_file.py'
Feb 26 20:28:24 compute-0 sudo[90965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:24 compute-0 python3.9[90968]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:24 compute-0 sudo[90965]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:25 compute-0 python3.9[91118]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:28:26 compute-0 sudo[91269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmjvfxikeokqvenqsxdemxektmsdkmar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137706.2585027-317-165497040106170/AnsiballZ_command.py'
Feb 26 20:28:26 compute-0 sudo[91269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:26 compute-0 python3.9[91272]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:e0:eb:c4:a5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:26 compute-0 ovs-vsctl[91273]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:e0:eb:c4:a5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 26 20:28:26 compute-0 sudo[91269]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:27 compute-0 sudo[91423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txutctjimxmwxwprmhbjgwxricqkkfaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137706.8236382-326-247603628848926/AnsiballZ_command.py'
Feb 26 20:28:27 compute-0 sudo[91423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:27 compute-0 python3.9[91426]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:27 compute-0 sudo[91423]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:27 compute-0 sudo[91579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzroaqdzbgbbrpcummlddtekmadvyetg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137707.4645247-334-68667039016232/AnsiballZ_command.py'
Feb 26 20:28:27 compute-0 sudo[91579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:27 compute-0 python3.9[91582]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:27 compute-0 ovs-vsctl[91583]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 26 20:28:27 compute-0 sudo[91579]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:28 compute-0 python3.9[91733]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:28:28 compute-0 sudo[91885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoebfhafgnykjzriuyimryhbncrxjwgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137708.5888348-351-53913448467536/AnsiballZ_file.py'
Feb 26 20:28:28 compute-0 sudo[91885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:29 compute-0 python3.9[91888]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:29 compute-0 sudo[91885]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:29 compute-0 sudo[92038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydcolluywgjmscrpmgquzvfxyucfvapt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137709.1697588-359-48457970686127/AnsiballZ_stat.py'
Feb 26 20:28:29 compute-0 sudo[92038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:29 compute-0 python3.9[92041]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:29 compute-0 sudo[92038]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:29 compute-0 sudo[92117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrqymvpxypxoajppymvrixdzwrdtnupz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137709.1697588-359-48457970686127/AnsiballZ_file.py'
Feb 26 20:28:29 compute-0 sudo[92117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:29 compute-0 python3.9[92120]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:29 compute-0 sudo[92117]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:30 compute-0 sudo[92270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osjmyhaovnpvaggwuxvjdlpgkfqrsput ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137710.1122193-359-123322195040192/AnsiballZ_stat.py'
Feb 26 20:28:30 compute-0 sudo[92270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:30 compute-0 python3.9[92273]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:30 compute-0 sudo[92270]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:30 compute-0 sudo[92349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgupyieachmndkpoowppldtndxktkguu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137710.1122193-359-123322195040192/AnsiballZ_file.py'
Feb 26 20:28:30 compute-0 sudo[92349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:31 compute-0 python3.9[92352]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:31 compute-0 sudo[92349]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:31 compute-0 sudo[92502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azlavbjvicgpgogecnsleagwosznuhdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137711.1569002-382-140216310476107/AnsiballZ_file.py'
Feb 26 20:28:31 compute-0 sudo[92502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:31 compute-0 python3.9[92505]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:31 compute-0 sudo[92502]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:31 compute-0 sudo[92655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axlkenodwhfqbvpscgorqaeolmzmdxtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137711.70567-390-121546433530871/AnsiballZ_stat.py'
Feb 26 20:28:31 compute-0 sudo[92655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:32 compute-0 python3.9[92658]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:32 compute-0 sudo[92655]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:32 compute-0 sudo[92734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwnardkieuzzttungasibhrjibtzwcvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137711.70567-390-121546433530871/AnsiballZ_file.py'
Feb 26 20:28:32 compute-0 sudo[92734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:32 compute-0 python3.9[92737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:32 compute-0 sudo[92734]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:32 compute-0 sudo[92887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bujmjyqlowicdrzmfzfkodtjykqrlrqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137712.6621163-402-239236605931870/AnsiballZ_stat.py'
Feb 26 20:28:32 compute-0 sudo[92887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:33 compute-0 python3.9[92890]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:33 compute-0 sudo[92887]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:33 compute-0 sudo[92966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikmfxyceomcgtxzxusaeldpuaebfqhlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137712.6621163-402-239236605931870/AnsiballZ_file.py'
Feb 26 20:28:33 compute-0 sudo[92966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:33 compute-0 python3.9[92969]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:33 compute-0 sudo[92966]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:33 compute-0 sudo[93119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiczqdeohtkxeusgudujxaqenkzsfeal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137713.6856341-414-108396246982428/AnsiballZ_systemd.py'
Feb 26 20:28:33 compute-0 sudo[93119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:34 compute-0 python3.9[93122]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:28:34 compute-0 systemd[1]: Reloading.
Feb 26 20:28:34 compute-0 systemd-rc-local-generator[93140]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:28:34 compute-0 systemd-sysv-generator[93144]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:28:34 compute-0 sudo[93119]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:34 compute-0 sudo[93316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgurtifphwlrkwustdasndzejhkllopg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137714.6367319-422-174123152733350/AnsiballZ_stat.py'
Feb 26 20:28:34 compute-0 sudo[93316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:35 compute-0 python3.9[93319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:35 compute-0 sudo[93316]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:35 compute-0 sudo[93395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbefophkwhzbfsbukxdvnvwsxnsxyicn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137714.6367319-422-174123152733350/AnsiballZ_file.py'
Feb 26 20:28:35 compute-0 sudo[93395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:35 compute-0 python3.9[93398]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:35 compute-0 sudo[93395]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:36 compute-0 sudo[93548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebmwekectjkbyraegjowlixflyxgjvgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137715.775754-434-49986258379252/AnsiballZ_stat.py'
Feb 26 20:28:36 compute-0 sudo[93548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:36 compute-0 python3.9[93551]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:36 compute-0 sudo[93548]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:37 compute-0 sudo[93628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvkdedfhfgsbvgcgltemeufxnbnnnzty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137715.775754-434-49986258379252/AnsiballZ_file.py'
Feb 26 20:28:37 compute-0 sudo[93628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:37 compute-0 python3.9[93631]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:37 compute-0 sudo[93628]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:37 compute-0 sudo[93781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzcnuljqbsjnpxubjirmesctbsdrhorw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137717.5130737-446-137538729960286/AnsiballZ_systemd.py'
Feb 26 20:28:37 compute-0 sudo[93781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:38 compute-0 python3.9[93784]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:28:38 compute-0 systemd[1]: Reloading.
Feb 26 20:28:38 compute-0 systemd-rc-local-generator[93813]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:28:38 compute-0 systemd-sysv-generator[93818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:28:38 compute-0 systemd[1]: Starting Create netns directory...
Feb 26 20:28:38 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 26 20:28:38 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 26 20:28:38 compute-0 systemd[1]: Finished Create netns directory.
Feb 26 20:28:38 compute-0 sudo[93781]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:38 compute-0 sudo[93982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufxjtnonsqrencqyhttomfsttedwtqgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137718.6503317-456-249032995461018/AnsiballZ_file.py'
Feb 26 20:28:38 compute-0 sudo[93982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:39 compute-0 python3.9[93985]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:39 compute-0 sudo[93982]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:39 compute-0 sudo[94135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wydhnplnxqfzgrcqxtdiiyxunpqxnvmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137719.2460241-464-93264776430822/AnsiballZ_stat.py'
Feb 26 20:28:39 compute-0 sudo[94135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:39 compute-0 python3.9[94138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:39 compute-0 sudo[94135]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:39 compute-0 sudo[94259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpbbbzztzpjdscvibytytrlgzjcvllyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137719.2460241-464-93264776430822/AnsiballZ_copy.py'
Feb 26 20:28:40 compute-0 sudo[94259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:40 compute-0 python3.9[94262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137719.2460241-464-93264776430822/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:40 compute-0 sudo[94259]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:40 compute-0 sudo[94412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upchjldtwuxknkhzmrflsjjszbapedgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137720.5159738-481-236723922085892/AnsiballZ_file.py'
Feb 26 20:28:40 compute-0 sudo[94412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:40 compute-0 python3.9[94415]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:40 compute-0 sudo[94412]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:41 compute-0 sudo[94565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlwywyjytbxasxjghggjfytqeifvqwqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137721.0736988-489-37075547308278/AnsiballZ_file.py'
Feb 26 20:28:41 compute-0 sudo[94565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:41 compute-0 python3.9[94568]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:28:41 compute-0 sudo[94565]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:41 compute-0 sudo[94718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzziimfmrdkxrhicwfhjuhywmocnyyxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137721.7207067-497-55191321692396/AnsiballZ_stat.py'
Feb 26 20:28:41 compute-0 sudo[94718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:42 compute-0 python3.9[94721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:42 compute-0 sudo[94718]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:42 compute-0 sudo[94842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmncethutiqvrdylksnijaexjjmdzjyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137721.7207067-497-55191321692396/AnsiballZ_copy.py'
Feb 26 20:28:42 compute-0 sudo[94842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:42 compute-0 python3.9[94845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137721.7207067-497-55191321692396/.source.json _original_basename=.ezo3_t6v follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:42 compute-0 sudo[94842]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:43 compute-0 python3.9[94995]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:45 compute-0 sudo[95416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvfvwdcqjtbqtwdpphljvdfzyerfmwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137724.6617658-537-177260844985553/AnsiballZ_container_config_data.py'
Feb 26 20:28:45 compute-0 sudo[95416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:45 compute-0 python3.9[95419]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 26 20:28:45 compute-0 sudo[95416]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:46 compute-0 sudo[95569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdfexkqwzhgpdbsxtilechjpedwcmvzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137725.627829-548-68782535325067/AnsiballZ_container_config_hash.py'
Feb 26 20:28:46 compute-0 sudo[95569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:46 compute-0 python3.9[95572]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:28:46 compute-0 sudo[95569]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:47 compute-0 sudo[95722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzcrxpkttinkvuhyvlrkrjpsuyqwofzw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772137726.5338962-558-129322762802217/AnsiballZ_edpm_container_manage.py'
Feb 26 20:28:47 compute-0 sudo[95722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:47 compute-0 python3[95725]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:28:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:28:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:28:47 compute-0 podman[95761]: 2026-02-26 20:28:47.480396865 +0000 UTC m=+0.048875841 container create c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 26 20:28:47 compute-0 podman[95761]: 2026-02-26 20:28:47.450754821 +0000 UTC m=+0.019233827 image pull ce6781f051bf092c13d84cb587c56ad7edaa58b70fcc0effc1dff15724d5232e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 26 20:28:47 compute-0 python3[95725]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 26 20:28:47 compute-0 sudo[95722]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:48 compute-0 sudo[95950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sotpsuolwrlvoqodtpwrazemnvgaheox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137727.8023126-566-31662290761356/AnsiballZ_stat.py'
Feb 26 20:28:48 compute-0 sudo[95950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:48 compute-0 python3.9[95953]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:28:48 compute-0 sudo[95950]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 26 20:28:48 compute-0 sudo[96105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcbcqgranfxjwhxqgexrjcmayahasohy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137728.4943247-575-237818822167995/AnsiballZ_file.py'
Feb 26 20:28:48 compute-0 sudo[96105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:48 compute-0 python3.9[96108]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:48 compute-0 sudo[96105]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:49 compute-0 sudo[96182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqqjwkoeiqhgkwgmcdeiklkxizoitopm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137728.4943247-575-237818822167995/AnsiballZ_stat.py'
Feb 26 20:28:49 compute-0 sudo[96182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:49 compute-0 python3.9[96185]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:28:49 compute-0 sudo[96182]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:49 compute-0 sudo[96334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilieavgsnmhrszbwzvzihsioqpzzwlnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137729.443394-575-227596848338312/AnsiballZ_copy.py'
Feb 26 20:28:49 compute-0 sudo[96334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:49 compute-0 python3.9[96337]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772137729.443394-575-227596848338312/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:49 compute-0 sudo[96334]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:50 compute-0 sudo[96411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admxwqldgooppgucqwspjyhvukzoteqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137729.443394-575-227596848338312/AnsiballZ_systemd.py'
Feb 26 20:28:50 compute-0 sudo[96411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:50 compute-0 python3.9[96414]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:28:50 compute-0 systemd[1]: Reloading.
Feb 26 20:28:50 compute-0 systemd-rc-local-generator[96441]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:28:50 compute-0 systemd-sysv-generator[96446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:28:50 compute-0 sudo[96411]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:51 compute-0 sudo[96530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmrgyrvjmdhxdadejuhtshyfwqnwkip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137729.443394-575-227596848338312/AnsiballZ_systemd.py'
Feb 26 20:28:51 compute-0 sudo[96530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:51 compute-0 python3.9[96533]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:28:51 compute-0 systemd[1]: Reloading.
Feb 26 20:28:51 compute-0 systemd-rc-local-generator[96565]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:28:51 compute-0 systemd-sysv-generator[96569]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:28:51 compute-0 systemd[1]: Starting ovn_controller container...
Feb 26 20:28:51 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 26 20:28:51 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:28:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbadd8b85376194f875df07d5f09c071ab655049ef602665beffc9c067c85de9/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 26 20:28:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25.
Feb 26 20:28:51 compute-0 podman[96582]: 2026-02-26 20:28:51.797297091 +0000 UTC m=+0.127701142 container init c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 26 20:28:51 compute-0 ovn_controller[96598]: + sudo -E kolla_set_configs
Feb 26 20:28:51 compute-0 podman[96582]: 2026-02-26 20:28:51.825619611 +0000 UTC m=+0.156023662 container start c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 26 20:28:51 compute-0 edpm-start-podman-container[96582]: ovn_controller
Feb 26 20:28:51 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 26 20:28:51 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 26 20:28:51 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 26 20:28:51 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 26 20:28:51 compute-0 systemd[96635]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 26 20:28:51 compute-0 podman[96605]: 2026-02-26 20:28:51.950462014 +0000 UTC m=+0.114804196 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:28:51 compute-0 systemd[1]: c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25-3e59d1fcd2d3175b.service: Main process exited, code=exited, status=1/FAILURE
Feb 26 20:28:51 compute-0 systemd[1]: c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25-3e59d1fcd2d3175b.service: Failed with result 'exit-code'.
Feb 26 20:28:51 compute-0 edpm-start-podman-container[96581]: Creating additional drop-in dependency for "ovn_controller" (c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25)
Feb 26 20:28:51 compute-0 systemd[1]: Reloading.
Feb 26 20:28:52 compute-0 systemd-rc-local-generator[96675]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:28:52 compute-0 systemd-sysv-generator[96683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:28:52 compute-0 systemd[96635]: Queued start job for default target Main User Target.
Feb 26 20:28:52 compute-0 systemd[96635]: Created slice User Application Slice.
Feb 26 20:28:52 compute-0 systemd[96635]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 26 20:28:52 compute-0 systemd[96635]: Started Daily Cleanup of User's Temporary Directories.
Feb 26 20:28:52 compute-0 systemd[96635]: Reached target Paths.
Feb 26 20:28:52 compute-0 systemd[96635]: Reached target Timers.
Feb 26 20:28:52 compute-0 systemd[96635]: Starting D-Bus User Message Bus Socket...
Feb 26 20:28:52 compute-0 systemd[96635]: Starting Create User's Volatile Files and Directories...
Feb 26 20:28:52 compute-0 systemd[96635]: Finished Create User's Volatile Files and Directories.
Feb 26 20:28:52 compute-0 systemd[96635]: Listening on D-Bus User Message Bus Socket.
Feb 26 20:28:52 compute-0 systemd[96635]: Reached target Sockets.
Feb 26 20:28:52 compute-0 systemd[96635]: Reached target Basic System.
Feb 26 20:28:52 compute-0 systemd[96635]: Reached target Main User Target.
Feb 26 20:28:52 compute-0 systemd[96635]: Startup finished in 119ms.
Feb 26 20:28:52 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 26 20:28:52 compute-0 systemd[1]: Started ovn_controller container.
Feb 26 20:28:52 compute-0 systemd[1]: Started Session c1 of User root.
Feb 26 20:28:52 compute-0 sudo[96530]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:52 compute-0 ovn_controller[96598]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 26 20:28:52 compute-0 ovn_controller[96598]: INFO:__main__:Validating config file
Feb 26 20:28:52 compute-0 ovn_controller[96598]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 26 20:28:52 compute-0 ovn_controller[96598]: INFO:__main__:Writing out command to execute
Feb 26 20:28:52 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: ++ cat /run_command
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + ARGS=
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + sudo kolla_copy_cacerts
Feb 26 20:28:52 compute-0 systemd[1]: Started Session c2 of User root.
Feb 26 20:28:52 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + [[ ! -n '' ]]
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + . kolla_extend_start
Feb 26 20:28:52 compute-0 ovn_controller[96598]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + umask 0022
Feb 26 20:28:52 compute-0 ovn_controller[96598]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3121] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3132] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <warn>  [1772137732.3135] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3144] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3151] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3154] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 26 20:28:52 compute-0 kernel: br-int: entered promiscuous mode
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00019|main|INFO|OVS feature set changed, force recompute.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 26 20:28:52 compute-0 ovn_controller[96598]: 2026-02-26T20:28:52Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3381] manager: (ovn-89a7a8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 26 20:28:52 compute-0 systemd-udevd[96736]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:28:52 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 26 20:28:52 compute-0 systemd-udevd[96737]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3567] device (genev_sys_6081): carrier: link connected
Feb 26 20:28:52 compute-0 NetworkManager[56360]: <info>  [1772137732.3572] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 26 20:28:53 compute-0 python3.9[96865]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:28:53 compute-0 sudo[97015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgplrnnryysstuhjklytuwikgdoeecrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137733.3877647-620-152199407733374/AnsiballZ_stat.py'
Feb 26 20:28:53 compute-0 sudo[97015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:53 compute-0 python3.9[97018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:28:53 compute-0 sudo[97015]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:54 compute-0 sudo[97139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfjmwpqbsllxfotgfpcqhhtgleqxtnqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137733.3877647-620-152199407733374/AnsiballZ_copy.py'
Feb 26 20:28:54 compute-0 sudo[97139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:54 compute-0 python3.9[97142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137733.3877647-620-152199407733374/.source.yaml _original_basename=.cl2mpn6y follow=False checksum=21ae5417cb141682d0f5f6e14745e660083614bc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:28:54 compute-0 sudo[97139]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:54 compute-0 sudo[97292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eewfemidlvbeiiyykdqtuqtgrmpyxinx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137734.540563-635-275928226339629/AnsiballZ_command.py'
Feb 26 20:28:54 compute-0 sudo[97292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:55 compute-0 python3.9[97295]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:55 compute-0 ovs-vsctl[97296]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 26 20:28:55 compute-0 sudo[97292]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:55 compute-0 sudo[97446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzklvonhpioxuqkgtaldmdtpnghdmobl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137735.2224276-643-134491640251727/AnsiballZ_command.py'
Feb 26 20:28:55 compute-0 sudo[97446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:55 compute-0 python3.9[97449]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:55 compute-0 ovs-vsctl[97451]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 26 20:28:55 compute-0 sudo[97446]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:56 compute-0 sudo[97602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pspcwmmjvpgyucvgjqyifhyxcwtinxpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137736.013993-657-130110236884648/AnsiballZ_command.py'
Feb 26 20:28:56 compute-0 sudo[97602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:28:56 compute-0 python3.9[97605]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:28:56 compute-0 ovs-vsctl[97606]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 26 20:28:56 compute-0 sudo[97602]: pam_unix(sudo:session): session closed for user root
Feb 26 20:28:56 compute-0 sshd-session[86022]: Connection closed by 192.168.122.30 port 52328
Feb 26 20:28:56 compute-0 sshd-session[86019]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:28:56 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 26 20:28:56 compute-0 systemd[1]: session-19.scope: Consumed 40.854s CPU time.
Feb 26 20:28:56 compute-0 systemd-logind[825]: Session 19 logged out. Waiting for processes to exit.
Feb 26 20:28:56 compute-0 systemd-logind[825]: Removed session 19.
Feb 26 20:29:01 compute-0 sshd-session[97631]: Accepted publickey for zuul from 192.168.122.30 port 50268 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:29:01 compute-0 systemd-logind[825]: New session 21 of user zuul.
Feb 26 20:29:01 compute-0 systemd[1]: Started Session 21 of User zuul.
Feb 26 20:29:01 compute-0 sshd-session[97631]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:29:02 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 26 20:29:02 compute-0 systemd[96635]: Activating special unit Exit the Session...
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped target Main User Target.
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped target Basic System.
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped target Paths.
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped target Sockets.
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped target Timers.
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 26 20:29:02 compute-0 systemd[96635]: Closed D-Bus User Message Bus Socket.
Feb 26 20:29:02 compute-0 systemd[96635]: Stopped Create User's Volatile Files and Directories.
Feb 26 20:29:02 compute-0 systemd[96635]: Removed slice User Application Slice.
Feb 26 20:29:02 compute-0 systemd[96635]: Reached target Shutdown.
Feb 26 20:29:02 compute-0 systemd[96635]: Finished Exit the Session.
Feb 26 20:29:02 compute-0 systemd[96635]: Reached target Exit the Session.
Feb 26 20:29:02 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 26 20:29:02 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 26 20:29:02 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 26 20:29:02 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 26 20:29:02 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 26 20:29:02 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 26 20:29:02 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 26 20:29:02 compute-0 python3.9[97786]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:29:03 compute-0 sudo[97940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umwwwkktosmehzusvpervfwdjyblfget ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137743.2752302-29-243161313725895/AnsiballZ_file.py'
Feb 26 20:29:03 compute-0 sudo[97940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:03 compute-0 python3.9[97943]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:03 compute-0 sudo[97940]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:04 compute-0 sudo[98093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpyhebognauysfvelwwpkzverubuukhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137743.942849-29-163910467425111/AnsiballZ_file.py'
Feb 26 20:29:04 compute-0 sudo[98093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:04 compute-0 python3.9[98096]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:04 compute-0 sudo[98093]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:04 compute-0 sudo[98246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uamfkwmmmxaaqawjmfeguexvyzhtbalu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137744.4898853-29-5003033925056/AnsiballZ_file.py'
Feb 26 20:29:04 compute-0 sudo[98246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:04 compute-0 python3.9[98249]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:04 compute-0 sudo[98246]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:05 compute-0 sudo[98399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqnmbindcopxuujiwjosvvpelklpkdvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137744.992636-29-29343738072298/AnsiballZ_file.py'
Feb 26 20:29:05 compute-0 sudo[98399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:05 compute-0 python3.9[98402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:05 compute-0 sudo[98399]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:05 compute-0 sudo[98552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stgunqaziwgyupnkndyrcunayvvbzmcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137745.5699534-29-192404450901970/AnsiballZ_file.py'
Feb 26 20:29:05 compute-0 sudo[98552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:06 compute-0 python3.9[98555]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:06 compute-0 sudo[98552]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:06 compute-0 python3.9[98705]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:29:07 compute-0 sudo[98855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmawqpycorssmbmcvmmuqukqdbyrtnvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137746.9756505-73-95251542555691/AnsiballZ_seboolean.py'
Feb 26 20:29:07 compute-0 sudo[98855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:07 compute-0 python3.9[98858]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 26 20:29:08 compute-0 sudo[98855]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:08 compute-0 python3.9[99008]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:09 compute-0 python3.9[99129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137748.2846189-81-257400566196810/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:10 compute-0 python3.9[99279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:10 compute-0 python3.9[99401]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137749.6814482-96-255432780743310/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:11 compute-0 sudo[99551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iudioljabtnmqzruibxssarpdrpiqonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137750.8389668-113-96275792709882/AnsiballZ_setup.py'
Feb 26 20:29:11 compute-0 sudo[99551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:11 compute-0 python3.9[99554]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:29:11 compute-0 sudo[99551]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:12 compute-0 sudo[99636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjcskikaqoymergnhjcryfzpbeewxoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137750.8389668-113-96275792709882/AnsiballZ_dnf.py'
Feb 26 20:29:12 compute-0 sudo[99636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:12 compute-0 python3.9[99639]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:29:13 compute-0 sudo[99636]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:14 compute-0 sudo[99790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndruzrvgepdxjsggvujstjponnfmmnir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137753.8530602-125-83240345434402/AnsiballZ_systemd.py'
Feb 26 20:29:14 compute-0 sudo[99790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:14 compute-0 python3.9[99793]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:29:14 compute-0 sudo[99790]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:15 compute-0 python3.9[99946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:15 compute-0 python3.9[100067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137754.9789863-133-68835054008870/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:16 compute-0 python3.9[100217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:16 compute-0 python3.9[100338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137756.058689-133-153106140076889/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:18 compute-0 python3.9[100488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:18 compute-0 python3.9[100609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137757.6100795-177-247676518865043/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:19 compute-0 python3.9[100759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:19 compute-0 python3.9[100880]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137758.6555133-177-28636979913435/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:20 compute-0 python3.9[101030]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:29:20 compute-0 sudo[101182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frabwbuecfotscwhsmytmrdgjwkibvyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137760.4357345-215-219529821979558/AnsiballZ_file.py'
Feb 26 20:29:20 compute-0 sudo[101182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:20 compute-0 python3.9[101185]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:20 compute-0 sudo[101182]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:21 compute-0 sudo[101335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcvhkyfeutfimunaufnelfiazkmuvdvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137761.054807-223-229943083143147/AnsiballZ_stat.py'
Feb 26 20:29:21 compute-0 sudo[101335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:21 compute-0 python3.9[101338]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:21 compute-0 sudo[101335]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:21 compute-0 sudo[101414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njdugaxxbvnfgehmurytnrwikvlydghq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137761.054807-223-229943083143147/AnsiballZ_file.py'
Feb 26 20:29:21 compute-0 sudo[101414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:21 compute-0 python3.9[101417]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:22 compute-0 sudo[101414]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:22 compute-0 ovn_controller[96598]: 2026-02-26T20:29:22Z|00025|memory|INFO|16896 kB peak resident set size after 29.8 seconds
Feb 26 20:29:22 compute-0 ovn_controller[96598]: 2026-02-26T20:29:22Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Feb 26 20:29:22 compute-0 podman[101418]: 2026-02-26 20:29:22.128311251 +0000 UTC m=+0.089360604 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 26 20:29:22 compute-0 sudo[101593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmtnhczthfpysnxtginwtpujdosrsscl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137762.1337912-223-266626591322733/AnsiballZ_stat.py'
Feb 26 20:29:22 compute-0 sudo[101593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:22 compute-0 python3.9[101596]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:22 compute-0 sudo[101593]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:22 compute-0 sudo[101672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rilekirveqkhjpguvaaoyupurqwkskxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137762.1337912-223-266626591322733/AnsiballZ_file.py'
Feb 26 20:29:22 compute-0 sudo[101672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:22 compute-0 python3.9[101675]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:22 compute-0 sudo[101672]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:23 compute-0 sudo[101825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zklpjwlswcbnsqbwrqljfxgbccoknfwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137763.0877502-246-48331680742347/AnsiballZ_file.py'
Feb 26 20:29:23 compute-0 sudo[101825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:23 compute-0 python3.9[101828]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:23 compute-0 sudo[101825]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:23 compute-0 sudo[101978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqiaksexiwzkrryauwwdakibmdunhtlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137763.7602236-254-74353246528425/AnsiballZ_stat.py'
Feb 26 20:29:24 compute-0 sudo[101978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:24 compute-0 python3.9[101981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:24 compute-0 sudo[101978]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:24 compute-0 sudo[102057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmbtxebxhlrosntuwgravfmkwkorfvsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137763.7602236-254-74353246528425/AnsiballZ_file.py'
Feb 26 20:29:24 compute-0 sudo[102057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:24 compute-0 python3.9[102060]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:24 compute-0 sudo[102057]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:25 compute-0 sudo[102210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unesfdyqzhqpyvmibditjylrwwwuskfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137764.8054945-266-218705373093832/AnsiballZ_stat.py'
Feb 26 20:29:25 compute-0 sudo[102210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:25 compute-0 python3.9[102213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:25 compute-0 sudo[102210]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:25 compute-0 sudo[102289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cldggowhezxjkkxovxfpghabexquvpda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137764.8054945-266-218705373093832/AnsiballZ_file.py'
Feb 26 20:29:25 compute-0 sudo[102289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:25 compute-0 python3.9[102292]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:25 compute-0 sudo[102289]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:26 compute-0 sudo[102442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsefhxboeumnexttvyidxheyoobuzupg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137765.9486265-278-139082949699596/AnsiballZ_systemd.py'
Feb 26 20:29:26 compute-0 sudo[102442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:26 compute-0 python3.9[102445]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:29:26 compute-0 systemd[1]: Reloading.
Feb 26 20:29:26 compute-0 systemd-sysv-generator[102475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:29:26 compute-0 systemd-rc-local-generator[102468]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:29:26 compute-0 sudo[102442]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:27 compute-0 sudo[102640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbohboogjkwcdemdyayrijxmapcbxuyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137767.0315182-286-91966974518884/AnsiballZ_stat.py'
Feb 26 20:29:27 compute-0 sudo[102640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:27 compute-0 python3.9[102643]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:27 compute-0 sudo[102640]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:27 compute-0 sudo[102719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umqpmsxzmqzxwavmcngvfnlssvzsdsln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137767.0315182-286-91966974518884/AnsiballZ_file.py'
Feb 26 20:29:27 compute-0 sudo[102719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:28 compute-0 python3.9[102722]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:28 compute-0 sudo[102719]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:28 compute-0 sudo[102872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfucybuaegwpsckylwjjrixmbbvwpbjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137768.1818054-298-208900868821913/AnsiballZ_stat.py'
Feb 26 20:29:28 compute-0 sudo[102872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:28 compute-0 python3.9[102875]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:28 compute-0 sudo[102872]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:28 compute-0 sudo[102951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lphupscgvpriaihgderpflnrvtwajwad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137768.1818054-298-208900868821913/AnsiballZ_file.py'
Feb 26 20:29:28 compute-0 sudo[102951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:29 compute-0 python3.9[102954]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:29 compute-0 sudo[102951]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:29 compute-0 sudo[103104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmuzmklhavlsbwfymbkdjjeuzuienyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137769.4708712-310-23380418019961/AnsiballZ_systemd.py'
Feb 26 20:29:29 compute-0 sudo[103104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:30 compute-0 python3.9[103107]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:29:30 compute-0 systemd[1]: Reloading.
Feb 26 20:29:30 compute-0 systemd-rc-local-generator[103134]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:29:30 compute-0 systemd-sysv-generator[103139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:29:30 compute-0 systemd[1]: Starting Create netns directory...
Feb 26 20:29:30 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 26 20:29:30 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 26 20:29:30 compute-0 systemd[1]: Finished Create netns directory.
Feb 26 20:29:30 compute-0 sudo[103104]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:30 compute-0 sudo[103308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pantbzqlhgjukzbvbjqfcsihcjyxmrvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137770.6055253-320-24640528823247/AnsiballZ_file.py'
Feb 26 20:29:30 compute-0 sudo[103308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:31 compute-0 python3.9[103311]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:31 compute-0 sudo[103308]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:31 compute-0 sudo[103461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqddrblmshvjknrecxzajfcijhuqoxix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137771.233489-328-261266723661677/AnsiballZ_stat.py'
Feb 26 20:29:31 compute-0 sudo[103461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:31 compute-0 python3.9[103464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:31 compute-0 sudo[103461]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:32 compute-0 sudo[103585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrxyfzkscvqnzjeckcgghbgxlcxscnio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137771.233489-328-261266723661677/AnsiballZ_copy.py'
Feb 26 20:29:32 compute-0 sudo[103585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:32 compute-0 python3.9[103588]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772137771.233489-328-261266723661677/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:32 compute-0 sudo[103585]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:32 compute-0 sudo[103738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyuuvkewpkbnnihyiyydtyfcteuwypxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137772.564166-345-81078490140543/AnsiballZ_file.py'
Feb 26 20:29:32 compute-0 sudo[103738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:33 compute-0 python3.9[103741]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:33 compute-0 sudo[103738]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:33 compute-0 sudo[103891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzszjlzzfcbcawxnwgrqagtsbnpbvcqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137773.2046914-353-88219256673974/AnsiballZ_file.py'
Feb 26 20:29:33 compute-0 sudo[103891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:33 compute-0 python3.9[103894]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:29:33 compute-0 sudo[103891]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:34 compute-0 sudo[104044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsabmuwwbefqcqovnyrehargkttktqdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137773.9280934-361-197625224903875/AnsiballZ_stat.py'
Feb 26 20:29:34 compute-0 sudo[104044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:34 compute-0 python3.9[104047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:34 compute-0 sudo[104044]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:34 compute-0 sudo[104168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyejcjaigdzkvjskpbrlqxgppbqhdbpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137773.9280934-361-197625224903875/AnsiballZ_copy.py'
Feb 26 20:29:34 compute-0 sudo[104168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:35 compute-0 python3.9[104171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137773.9280934-361-197625224903875/.source.json _original_basename=.tn0cmi5l follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:35 compute-0 sudo[104168]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:35 compute-0 python3.9[104321]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:37 compute-0 sudo[104742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcswgfsyzrrxonxnpdguxhjksxpvzqgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137777.1255715-401-159712312605703/AnsiballZ_container_config_data.py'
Feb 26 20:29:37 compute-0 sudo[104742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:37 compute-0 python3.9[104745]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 26 20:29:37 compute-0 sudo[104742]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:38 compute-0 sudo[104895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lclaheaniihyioqlntgnbbdgtwasoymt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137778.1737533-412-250224753096742/AnsiballZ_container_config_hash.py'
Feb 26 20:29:38 compute-0 sudo[104895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:38 compute-0 python3.9[104898]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:29:38 compute-0 sudo[104895]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:39 compute-0 sudo[105048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbrogarzioctolaxzpzadpcxszwrkuht ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772137779.0654933-422-237547882196698/AnsiballZ_edpm_container_manage.py'
Feb 26 20:29:39 compute-0 sudo[105048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:39 compute-0 python3[105051]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:29:39 compute-0 podman[105089]: 2026-02-26 20:29:39.959164961 +0000 UTC m=+0.047133472 container create a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223)
Feb 26 20:29:39 compute-0 podman[105089]: 2026-02-26 20:29:39.936094523 +0000 UTC m=+0.024063044 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:29:39 compute-0 python3[105051]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:29:40 compute-0 sudo[105048]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:40 compute-0 sudo[105277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okxmwoxorxiixyacgmfximziqtgweihi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137780.2442288-430-2802072267007/AnsiballZ_stat.py'
Feb 26 20:29:40 compute-0 sudo[105277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:40 compute-0 python3.9[105280]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:29:40 compute-0 sudo[105277]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:41 compute-0 sudo[105432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cytmmeehqfiigpqyfemlaepbjygafjso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137780.9144332-439-3488191769471/AnsiballZ_file.py'
Feb 26 20:29:41 compute-0 sudo[105432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:41 compute-0 python3.9[105435]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:41 compute-0 sudo[105432]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:41 compute-0 sudo[105509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofvyvksvszwkysvhjtoxkbgiobujkand ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137780.9144332-439-3488191769471/AnsiballZ_stat.py'
Feb 26 20:29:41 compute-0 sudo[105509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:41 compute-0 python3.9[105512]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:29:41 compute-0 sudo[105509]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:42 compute-0 sudo[105661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljeqcgyplkgkilsmocpeyedhaovvgibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137781.9121163-439-222162164650545/AnsiballZ_copy.py'
Feb 26 20:29:42 compute-0 sudo[105661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:42 compute-0 python3.9[105664]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772137781.9121163-439-222162164650545/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:42 compute-0 sudo[105661]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:42 compute-0 sudo[105738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cftiepfgflebemjlnfsokfqetxpwixlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137781.9121163-439-222162164650545/AnsiballZ_systemd.py'
Feb 26 20:29:42 compute-0 sudo[105738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:43 compute-0 python3.9[105741]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:29:43 compute-0 systemd[1]: Reloading.
Feb 26 20:29:43 compute-0 systemd-sysv-generator[105770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:29:43 compute-0 systemd-rc-local-generator[105763]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:29:43 compute-0 sudo[105738]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:43 compute-0 sudo[105857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yesyzpfogenqgtqavrqnrppmethjgmck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137781.9121163-439-222162164650545/AnsiballZ_systemd.py'
Feb 26 20:29:43 compute-0 sudo[105857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:44 compute-0 python3.9[105860]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:29:44 compute-0 systemd[1]: Reloading.
Feb 26 20:29:44 compute-0 systemd-sysv-generator[105887]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:29:44 compute-0 systemd-rc-local-generator[105884]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:29:44 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 26 20:29:44 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:29:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772395033b3cb0e7feaf2542f3c037d23eaddfb1133d6115afe960e459cef252/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 26 20:29:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772395033b3cb0e7feaf2542f3c037d23eaddfb1133d6115afe960e459cef252/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:29:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0.
Feb 26 20:29:44 compute-0 podman[105908]: 2026-02-26 20:29:44.483333258 +0000 UTC m=+0.138125248 container init a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + sudo -E kolla_set_configs
Feb 26 20:29:44 compute-0 podman[105908]: 2026-02-26 20:29:44.516826325 +0000 UTC m=+0.171618385 container start a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:29:44 compute-0 edpm-start-podman-container[105908]: ovn_metadata_agent
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Validating config file
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Copying service configuration files
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Writing out command to execute
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: ++ cat /run_command
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + CMD=neutron-ovn-metadata-agent
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + ARGS=
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + sudo kolla_copy_cacerts
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + [[ ! -n '' ]]
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + . kolla_extend_start
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: Running command: 'neutron-ovn-metadata-agent'
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + umask 0022
Feb 26 20:29:44 compute-0 ovn_metadata_agent[105924]: + exec neutron-ovn-metadata-agent
Feb 26 20:29:44 compute-0 edpm-start-podman-container[105907]: Creating additional drop-in dependency for "ovn_metadata_agent" (a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0)
Feb 26 20:29:44 compute-0 systemd[1]: Reloading.
Feb 26 20:29:44 compute-0 podman[105931]: 2026-02-26 20:29:44.622029643 +0000 UTC m=+0.087985294 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0)
Feb 26 20:29:44 compute-0 systemd-sysv-generator[105997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:29:44 compute-0 systemd-rc-local-generator[105992]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:29:44 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 26 20:29:44 compute-0 sudo[105857]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:45 compute-0 python3.9[106167]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:29:46 compute-0 sudo[106317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hamnqcgncukgnppyssktdquukfccwcpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137786.0784762-484-225268640680103/AnsiballZ_stat.py'
Feb 26 20:29:46 compute-0 sudo[106317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.443 105929 INFO neutron.common.config [-] Logging enabled!
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.445 105929 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.445 105929 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.446 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.446 105929 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.446 105929 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.447 105929 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.447 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.447 105929 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.447 105929 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.447 105929 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.448 105929 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.449 105929 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.450 105929 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.451 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.452 105929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.453 105929 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.454 105929 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.455 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.456 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.457 105929 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.458 105929 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.459 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.460 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.461 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.462 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.463 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.464 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.465 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.466 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.467 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.468 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.469 105929 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.470 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.471 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.472 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.473 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.474 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.475 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.476 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.477 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.477 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.477 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.477 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.477 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.477 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.478 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.479 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.479 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.479 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.479 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.479 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.479 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.480 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.481 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.482 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.482 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.482 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.482 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.482 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.482 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.483 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.484 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.485 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.485 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.485 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.485 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.485 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.485 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.486 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.486 105929 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.486 105929 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.498 105929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.498 105929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.498 105929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.499 105929 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.500 105929 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.516 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 62bfa765-f40e-4724-bf05-2e8b811f0867 (UUID: 62bfa765-f40e-4724-bf05-2e8b811f0867) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.542 105929 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.542 105929 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.542 105929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.543 105929 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.546 105929 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.552 105929 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.558 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '62bfa765-f40e-4724-bf05-2e8b811f0867'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], external_ids={}, name=62bfa765-f40e-4724-bf05-2e8b811f0867, nb_cfg_timestamp=1772137740339, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.559 105929 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f6af43f6c40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.560 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.561 105929 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.561 105929 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.561 105929 INFO oslo_service.service [-] Starting 1 workers
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.567 105929 DEBUG oslo_service.service [-] Started child 106321 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 26 20:29:46 compute-0 python3.9[106320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.571 105929 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpyxmkkmlc/privsep.sock']
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.571 106321 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-368207'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.601 106321 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 26 20:29:46 compute-0 sudo[106317]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.601 106321 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.614 106321 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.617 106321 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.623 106321 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 26 20:29:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:46.628 106321 INFO eventlet.wsgi.server [-] (106321) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 26 20:29:46 compute-0 sudo[106448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibxhlgulzqrasmbljmwrqynribdrzrbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137786.0784762-484-225268640680103/AnsiballZ_copy.py'
Feb 26 20:29:46 compute-0 sudo[106448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:47 compute-0 python3.9[106451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137786.0784762-484-225268640680103/.source.yaml _original_basename=.1cqyfb5u follow=False checksum=43b7478fdb966ac061976c66cb586f169cdbbd70 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:29:47 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 26 20:29:47 compute-0 sudo[106448]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.195 105929 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.196 105929 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpyxmkkmlc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.074 106452 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.077 106452 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.079 106452 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.079 106452 INFO oslo.privsep.daemon [-] privsep daemon running as pid 106452
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.200 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[e470d7a1-afc7-4866-a55f-bc4b990a235b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:29:47 compute-0 sshd-session[97634]: Connection closed by 192.168.122.30 port 50268
Feb 26 20:29:47 compute-0 sshd-session[97631]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:29:47 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Feb 26 20:29:47 compute-0 systemd[1]: session-21.scope: Consumed 33.188s CPU time.
Feb 26 20:29:47 compute-0 systemd-logind[825]: Session 21 logged out. Waiting for processes to exit.
Feb 26 20:29:47 compute-0 systemd-logind[825]: Removed session 21.
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.684 106452 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.684 106452 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:29:47 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:47.684 106452 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.197 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[78741e2d-539c-4ff5-b029-6f002e61a25e]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.200 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, column=external_ids, values=({'neutron:ovn-metadata-id': '27d58dd2-bd5c-59cf-819d-714d7070554a'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.209 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.217 105929 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.217 105929 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.217 105929 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.217 105929 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.217 105929 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.217 105929 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.218 105929 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.218 105929 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.218 105929 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.218 105929 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.218 105929 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.219 105929 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.219 105929 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.219 105929 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.219 105929 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.219 105929 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.220 105929 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.220 105929 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.220 105929 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.220 105929 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.220 105929 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.220 105929 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.221 105929 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.221 105929 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.221 105929 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.221 105929 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.222 105929 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.222 105929 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.222 105929 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.222 105929 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.222 105929 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.222 105929 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.223 105929 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.223 105929 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.223 105929 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.223 105929 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.223 105929 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.224 105929 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.224 105929 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.224 105929 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.224 105929 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.225 105929 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.226 105929 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.226 105929 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.226 105929 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.226 105929 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.226 105929 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.226 105929 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.227 105929 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.227 105929 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.227 105929 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.227 105929 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.227 105929 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.227 105929 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.228 105929 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.229 105929 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.229 105929 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.229 105929 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.229 105929 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.229 105929 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.229 105929 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.230 105929 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.230 105929 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.230 105929 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.230 105929 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.230 105929 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.230 105929 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.231 105929 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.231 105929 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.231 105929 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.231 105929 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.231 105929 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.231 105929 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.232 105929 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.232 105929 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.232 105929 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.232 105929 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.232 105929 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.232 105929 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.233 105929 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.233 105929 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.233 105929 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.233 105929 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.233 105929 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.233 105929 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.234 105929 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.234 105929 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.234 105929 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.234 105929 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.234 105929 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.234 105929 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.235 105929 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.235 105929 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.235 105929 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.235 105929 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.235 105929 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.236 105929 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.236 105929 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.236 105929 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.236 105929 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.236 105929 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.237 105929 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.238 105929 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.238 105929 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.238 105929 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.238 105929 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.238 105929 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.238 105929 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.239 105929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.240 105929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.240 105929 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.240 105929 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.240 105929 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.240 105929 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.240 105929 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.241 105929 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.242 105929 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.242 105929 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.242 105929 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.242 105929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.242 105929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.243 105929 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.244 105929 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.245 105929 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.246 105929 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.247 105929 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.248 105929 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.248 105929 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.248 105929 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.248 105929 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.248 105929 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.248 105929 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.249 105929 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.250 105929 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.251 105929 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.252 105929 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.253 105929 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.254 105929 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.255 105929 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.255 105929 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.255 105929 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.255 105929 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.255 105929 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.255 105929 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.256 105929 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.256 105929 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.256 105929 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.256 105929 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.256 105929 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.256 105929 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.257 105929 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.258 105929 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.258 105929 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.258 105929 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.258 105929 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.258 105929 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.258 105929 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.259 105929 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.259 105929 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.259 105929 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.259 105929 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.259 105929 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.260 105929 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.260 105929 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.260 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.260 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.260 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.261 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.261 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.261 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.261 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.261 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.261 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.262 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.262 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.262 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.262 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.262 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.263 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.264 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.264 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.264 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.264 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.264 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.264 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.265 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.265 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.265 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.265 105929 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.265 105929 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.265 105929 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.266 105929 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.266 105929 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:29:48 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:29:48.266 105929 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 26 20:29:52 compute-0 sshd-session[106481]: Accepted publickey for zuul from 192.168.122.30 port 37370 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:29:52 compute-0 systemd-logind[825]: New session 22 of user zuul.
Feb 26 20:29:52 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 26 20:29:52 compute-0 sshd-session[106481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:29:52 compute-0 podman[106484]: 2026-02-26 20:29:52.295340735 +0000 UTC m=+0.109339420 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:29:53 compute-0 python3.9[106661]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:29:54 compute-0 sudo[106815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovxcaohfvtcjdvemuzumdqsjtnyyrimw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137793.7836564-29-76793029966498/AnsiballZ_command.py'
Feb 26 20:29:54 compute-0 sudo[106815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:54 compute-0 python3.9[106818]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:29:54 compute-0 sudo[106815]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:55 compute-0 sudo[106980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aikxfsivzrxhmzovjibbmhhhgfzuefmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137794.641738-40-186714662762657/AnsiballZ_systemd_service.py'
Feb 26 20:29:55 compute-0 sudo[106980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:55 compute-0 python3.9[106983]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:29:55 compute-0 systemd[1]: Reloading.
Feb 26 20:29:55 compute-0 systemd-rc-local-generator[107012]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:29:55 compute-0 systemd-sysv-generator[107015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:29:55 compute-0 sudo[106980]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:56 compute-0 python3.9[107176]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:29:56 compute-0 network[107193]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:29:56 compute-0 network[107194]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:29:56 compute-0 network[107195]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:29:58 compute-0 sudo[107455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idvumnsovoycgrbymixdxvyjwtvmajxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137798.420793-59-237138887229701/AnsiballZ_systemd_service.py'
Feb 26 20:29:58 compute-0 sudo[107455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:59 compute-0 python3.9[107458]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:29:59 compute-0 sudo[107455]: pam_unix(sudo:session): session closed for user root
Feb 26 20:29:59 compute-0 sudo[107609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjqkvbxycngbnuziwexzuifscmipdbfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137799.1766636-59-156489223793687/AnsiballZ_systemd_service.py'
Feb 26 20:29:59 compute-0 sudo[107609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:29:59 compute-0 python3.9[107612]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:29:59 compute-0 sudo[107609]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:00 compute-0 sudo[107763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roajptltzejndtqhzdjtpgokcrgujwvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137799.919748-59-49192690484611/AnsiballZ_systemd_service.py'
Feb 26 20:30:00 compute-0 sudo[107763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:00 compute-0 python3.9[107766]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:30:00 compute-0 sudo[107763]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:00 compute-0 sudo[107917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qncfhmtgenxasfvbzotquekefigdmrao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137800.6707652-59-181704313631160/AnsiballZ_systemd_service.py'
Feb 26 20:30:00 compute-0 sudo[107917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:01 compute-0 python3.9[107920]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:30:01 compute-0 sudo[107917]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:01 compute-0 sudo[108071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smnbcjlcdqrbmkydqygdkazrgtusrnof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137801.4328668-59-77921442185998/AnsiballZ_systemd_service.py'
Feb 26 20:30:01 compute-0 sudo[108071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:02 compute-0 python3.9[108074]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:30:02 compute-0 sudo[108071]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:02 compute-0 sudo[108225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogwnmiwnkopgmsxyslysqnywghqbfmpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137802.2541535-59-22950015926833/AnsiballZ_systemd_service.py'
Feb 26 20:30:02 compute-0 sudo[108225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:02 compute-0 python3.9[108228]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:30:02 compute-0 sudo[108225]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:03 compute-0 sudo[108379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tchokbmxxjpaucqyaqhrbkrzbifouvqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137802.8909957-59-264261143676164/AnsiballZ_systemd_service.py'
Feb 26 20:30:03 compute-0 sudo[108379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:03 compute-0 python3.9[108382]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:30:03 compute-0 sudo[108379]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:04 compute-0 sudo[108533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcnfvsgkkvuiavllpjoanshzwzojrmnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137803.6813464-111-129337925220677/AnsiballZ_file.py'
Feb 26 20:30:04 compute-0 sudo[108533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:04 compute-0 python3.9[108536]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:04 compute-0 sudo[108533]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:04 compute-0 sudo[108686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqejjhuwhdnuxobqjxnimtmoekmbzcaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137804.4125137-111-230571773494077/AnsiballZ_file.py'
Feb 26 20:30:04 compute-0 sudo[108686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:04 compute-0 python3.9[108689]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:04 compute-0 sudo[108686]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:05 compute-0 sudo[108839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bswmkkatvyrseztufxyqtxrnvowmjttl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137805.021513-111-12390134751027/AnsiballZ_file.py'
Feb 26 20:30:05 compute-0 sudo[108839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:05 compute-0 python3.9[108842]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:05 compute-0 sudo[108839]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:05 compute-0 sudo[108992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeoeadxntkbjwigjnptlzvcncnajecaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137805.493306-111-11510884281535/AnsiballZ_file.py'
Feb 26 20:30:05 compute-0 sudo[108992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:05 compute-0 python3.9[108995]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:05 compute-0 sudo[108992]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:06 compute-0 sudo[109145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jovcembkuerlotiturazstxdcxqusjbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137806.004023-111-141377807367452/AnsiballZ_file.py'
Feb 26 20:30:06 compute-0 sudo[109145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:06 compute-0 python3.9[109148]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:06 compute-0 sudo[109145]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:06 compute-0 sudo[109298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaofagefcqhmbbjqtpwgtvyhsrcqbqwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137806.569073-111-95144913687171/AnsiballZ_file.py'
Feb 26 20:30:06 compute-0 sudo[109298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:07 compute-0 python3.9[109301]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:07 compute-0 sudo[109298]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:07 compute-0 sudo[109451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luycoqsydfbvipmatajtyhtfwpqermhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137807.2517111-111-54619288323168/AnsiballZ_file.py'
Feb 26 20:30:07 compute-0 sudo[109451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:07 compute-0 python3.9[109454]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:07 compute-0 sudo[109451]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:08 compute-0 sudo[109604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yddowjlanxwoeuslipjanjkbomwritxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137807.7955153-161-48821964109375/AnsiballZ_file.py'
Feb 26 20:30:08 compute-0 sudo[109604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:08 compute-0 python3.9[109607]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:08 compute-0 sudo[109604]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:08 compute-0 sudo[109757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypseumimuszqjoovqqdetkowueizqslj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137808.3148127-161-118453966149131/AnsiballZ_file.py'
Feb 26 20:30:08 compute-0 sudo[109757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:08 compute-0 python3.9[109760]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:08 compute-0 sudo[109757]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:09 compute-0 sudo[109910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpkkfwqsnmheayzgthigsejwvugsuqmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137808.8416839-161-18972307825084/AnsiballZ_file.py'
Feb 26 20:30:09 compute-0 sudo[109910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:09 compute-0 python3.9[109913]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:09 compute-0 sudo[109910]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:09 compute-0 sudo[110063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfjkclnxznsetmgqlqrhbnmftmpowxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137809.306096-161-240278710578683/AnsiballZ_file.py'
Feb 26 20:30:09 compute-0 sudo[110063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:09 compute-0 python3.9[110066]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:09 compute-0 sudo[110063]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:10 compute-0 sudo[110216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifpixyjoivztbmkjjxqgynomabxvgzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137809.803404-161-47252161765990/AnsiballZ_file.py'
Feb 26 20:30:10 compute-0 sudo[110216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:10 compute-0 python3.9[110219]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:10 compute-0 sudo[110216]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:10 compute-0 sudo[110369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjaurrjriwrnandsyimtpwenqejzzbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137810.3388638-161-30038589281963/AnsiballZ_file.py'
Feb 26 20:30:10 compute-0 sudo[110369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:10 compute-0 python3.9[110372]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:10 compute-0 sudo[110369]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:11 compute-0 sudo[110522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaosxvbartpddunrqxukabvlghfdnwvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137810.9367404-161-53722899694730/AnsiballZ_file.py'
Feb 26 20:30:11 compute-0 sudo[110522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:11 compute-0 python3.9[110525]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:30:11 compute-0 sudo[110522]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:11 compute-0 sudo[110675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtyfqknceadbueyixhyjcjbwucjdkmkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137811.66339-212-113194054937817/AnsiballZ_command.py'
Feb 26 20:30:11 compute-0 sudo[110675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:12 compute-0 python3.9[110678]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:12 compute-0 sudo[110675]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:12 compute-0 python3.9[110830]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 26 20:30:13 compute-0 sudo[110980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chyeqkjgipmztakqxqebhhmkaoeeueap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137813.2121887-230-59759493445474/AnsiballZ_systemd_service.py'
Feb 26 20:30:13 compute-0 sudo[110980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:13 compute-0 python3.9[110983]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:30:13 compute-0 systemd[1]: Reloading.
Feb 26 20:30:13 compute-0 systemd-rc-local-generator[111011]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:30:13 compute-0 systemd-sysv-generator[111017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:30:13 compute-0 sudo[110980]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:14 compute-0 sudo[111175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhanrytbmnehtzrrfedvlqrraljhtbjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137814.0598526-238-99461477845363/AnsiballZ_command.py'
Feb 26 20:30:14 compute-0 sudo[111175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:14 compute-0 python3.9[111178]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:14 compute-0 sudo[111175]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:14 compute-0 sudo[111329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuqmlyenqbaqycqffutzucgrkcdzlwth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137814.6193728-238-149126225851742/AnsiballZ_command.py'
Feb 26 20:30:14 compute-0 sudo[111329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:14 compute-0 podman[111331]: 2026-02-26 20:30:14.921654164 +0000 UTC m=+0.050766080 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 26 20:30:15 compute-0 python3.9[111333]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:15 compute-0 sudo[111329]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:15 compute-0 sudo[111504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvdnqjkxobkrakxdswtjvcbkygqswyxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137815.2422438-238-39182631207333/AnsiballZ_command.py'
Feb 26 20:30:15 compute-0 sudo[111504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:15 compute-0 python3.9[111507]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:15 compute-0 sudo[111504]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:16 compute-0 sudo[111658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wovnavszhikoikrfdvpquxvtuitvtwqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137815.7925246-238-194886566550516/AnsiballZ_command.py'
Feb 26 20:30:16 compute-0 sudo[111658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:16 compute-0 python3.9[111661]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:16 compute-0 sudo[111658]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:16 compute-0 sudo[111812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwmabduxfcrbzzurhochkdqohsuciwon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137816.3359776-238-269287619445057/AnsiballZ_command.py'
Feb 26 20:30:16 compute-0 sudo[111812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:16 compute-0 python3.9[111815]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:16 compute-0 sudo[111812]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:17 compute-0 sudo[111966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvmjisbijhuxrbtxalfhvrprkczdksdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137816.898494-238-76943153374440/AnsiballZ_command.py'
Feb 26 20:30:17 compute-0 sudo[111966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:17 compute-0 python3.9[111969]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:17 compute-0 sudo[111966]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:17 compute-0 sudo[112120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwdiyoavlteuqltceofbazdmrsjzmacf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137817.4207442-238-788112898414/AnsiballZ_command.py'
Feb 26 20:30:17 compute-0 sudo[112120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:17 compute-0 python3.9[112123]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:30:17 compute-0 sudo[112120]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:18 compute-0 sudo[112274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trgrnybevvyxygpxntbbxnnazvoyiocy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137818.2257795-292-73260762512087/AnsiballZ_getent.py'
Feb 26 20:30:18 compute-0 sudo[112274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:18 compute-0 python3.9[112277]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 26 20:30:18 compute-0 sudo[112274]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:19 compute-0 sudo[112428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzoqncxgqvxcdepwwqlwiyzrjlvusfhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137819.1443677-300-239690990587548/AnsiballZ_group.py'
Feb 26 20:30:19 compute-0 sudo[112428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:19 compute-0 python3.9[112431]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 26 20:30:19 compute-0 groupadd[112432]: group added to /etc/group: name=libvirt, GID=42473
Feb 26 20:30:19 compute-0 groupadd[112432]: group added to /etc/gshadow: name=libvirt
Feb 26 20:30:19 compute-0 groupadd[112432]: new group: name=libvirt, GID=42473
Feb 26 20:30:19 compute-0 sudo[112428]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:20 compute-0 sudo[112587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afojxgubcfrllgammqksgtuuimcechfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137819.9834344-308-213468451209974/AnsiballZ_user.py'
Feb 26 20:30:20 compute-0 sudo[112587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:20 compute-0 python3.9[112590]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 26 20:30:20 compute-0 useradd[112592]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 26 20:30:20 compute-0 sudo[112587]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:21 compute-0 sudo[112748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuhztnbokjdnppnhomhindvmtwfkpoqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137821.0387936-319-64070849740505/AnsiballZ_setup.py'
Feb 26 20:30:21 compute-0 sudo[112748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:21 compute-0 python3.9[112751]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:30:21 compute-0 sudo[112748]: pam_unix(sudo:session): session closed for user root
Feb 26 20:30:22 compute-0 sudo[112833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thyjiicvzuwabaqiqrumeksavqldpkna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137821.0387936-319-64070849740505/AnsiballZ_dnf.py'
Feb 26 20:30:22 compute-0 sudo[112833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:30:22 compute-0 python3.9[112836]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:30:22 compute-0 podman[112838]: 2026-02-26 20:30:22.55544409 +0000 UTC m=+0.070340967 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0)
Feb 26 20:30:45 compute-0 podman[113052]: 2026-02-26 20:30:45.576022037 +0000 UTC m=+0.076411711 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 26 20:30:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:30:46.489 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:30:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:30:46.492 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:30:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:30:46.492 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:30:48 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:30:48 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:30:53 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 26 20:30:53 compute-0 podman[113080]: 2026-02-26 20:30:53.609606817 +0000 UTC m=+0.106861856 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 26 20:30:58 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:30:58 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:31:16 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 26 20:31:16 compute-0 podman[118237]: 2026-02-26 20:31:16.5912016 +0000 UTC m=+0.077618931 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223)
Feb 26 20:31:24 compute-0 podman[124750]: 2026-02-26 20:31:24.56003411 +0000 UTC m=+0.064383565 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 26 20:31:44 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 26 20:31:44 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 26 20:31:45 compute-0 groupadd[130076]: group added to /etc/group: name=dnsmasq, GID=993
Feb 26 20:31:45 compute-0 groupadd[130076]: group added to /etc/gshadow: name=dnsmasq
Feb 26 20:31:45 compute-0 groupadd[130076]: new group: name=dnsmasq, GID=993
Feb 26 20:31:45 compute-0 useradd[130083]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 26 20:31:45 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:31:45 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 26 20:31:45 compute-0 dbus-broker-launch[785]: Noticed file-system modification, trigger reload.
Feb 26 20:31:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:31:46.490 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:31:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:31:46.491 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:31:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:31:46.492 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:31:46 compute-0 groupadd[130096]: group added to /etc/group: name=clevis, GID=992
Feb 26 20:31:46 compute-0 groupadd[130096]: group added to /etc/gshadow: name=clevis
Feb 26 20:31:46 compute-0 groupadd[130096]: new group: name=clevis, GID=992
Feb 26 20:31:46 compute-0 useradd[130104]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 26 20:31:46 compute-0 usermod[130131]: add 'clevis' to group 'tss'
Feb 26 20:31:46 compute-0 usermod[130131]: add 'clevis' to shadow group 'tss'
Feb 26 20:31:46 compute-0 podman[130101]: 2026-02-26 20:31:46.738526564 +0000 UTC m=+0.101766942 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 26 20:31:48 compute-0 polkitd[44348]: Reloading rules
Feb 26 20:31:48 compute-0 polkitd[44348]: Collecting garbage unconditionally...
Feb 26 20:31:48 compute-0 polkitd[44348]: Loading rules from directory /etc/polkit-1/rules.d
Feb 26 20:31:48 compute-0 polkitd[44348]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 26 20:31:48 compute-0 polkitd[44348]: Finished loading, compiling and executing 3 rules
Feb 26 20:31:48 compute-0 polkitd[44348]: Reloading rules
Feb 26 20:31:48 compute-0 polkitd[44348]: Collecting garbage unconditionally...
Feb 26 20:31:48 compute-0 polkitd[44348]: Loading rules from directory /etc/polkit-1/rules.d
Feb 26 20:31:48 compute-0 polkitd[44348]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 26 20:31:48 compute-0 polkitd[44348]: Finished loading, compiling and executing 3 rules
Feb 26 20:31:50 compute-0 groupadd[130323]: group added to /etc/group: name=ceph, GID=167
Feb 26 20:31:50 compute-0 groupadd[130323]: group added to /etc/gshadow: name=ceph
Feb 26 20:31:50 compute-0 groupadd[130323]: new group: name=ceph, GID=167
Feb 26 20:31:50 compute-0 useradd[130329]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 26 20:31:52 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 26 20:31:52 compute-0 sshd[1017]: Received signal 15; terminating.
Feb 26 20:31:52 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 26 20:31:52 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 26 20:31:52 compute-0 systemd[1]: sshd.service: Consumed 1.232s CPU time, read 32.0K from disk, written 0B to disk.
Feb 26 20:31:52 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 26 20:31:52 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 26 20:31:52 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 26 20:31:52 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 26 20:31:52 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 26 20:31:52 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 26 20:31:52 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 26 20:31:52 compute-0 sshd[130848]: Server listening on 0.0.0.0 port 22.
Feb 26 20:31:52 compute-0 sshd[130848]: Server listening on :: port 22.
Feb 26 20:31:52 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 26 20:31:54 compute-0 podman[131044]: 2026-02-26 20:31:54.69051685 +0000 UTC m=+0.097757803 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 26 20:31:54 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:31:54 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:31:54 compute-0 systemd[1]: Reloading.
Feb 26 20:31:54 compute-0 systemd-rc-local-generator[131132]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:31:55 compute-0 systemd-sysv-generator[131135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:31:55 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:31:57 compute-0 sudo[112833]: pam_unix(sudo:session): session closed for user root
Feb 26 20:31:58 compute-0 sudo[135355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpevrpkoltemrfhvwxmaluuzmhgprioo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137917.6001158-331-998667192467/AnsiballZ_systemd.py'
Feb 26 20:31:58 compute-0 sudo[135355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:31:58 compute-0 python3.9[135382]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:31:58 compute-0 systemd[1]: Reloading.
Feb 26 20:31:58 compute-0 systemd-rc-local-generator[135980]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:31:58 compute-0 systemd-sysv-generator[135986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:31:58 compute-0 sudo[135355]: pam_unix(sudo:session): session closed for user root
Feb 26 20:31:59 compute-0 sudo[136904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etesehkamqwfiptywxtcyasvwqbckodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137918.9813354-331-178648710735977/AnsiballZ_systemd.py'
Feb 26 20:31:59 compute-0 sudo[136904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:31:59 compute-0 python3.9[136928]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:31:59 compute-0 systemd[1]: Reloading.
Feb 26 20:31:59 compute-0 systemd-sysv-generator[137409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:31:59 compute-0 systemd-rc-local-generator[137405]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:31:59 compute-0 sudo[136904]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:00 compute-0 sudo[138230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvycdrfsgfafpqmjhcuheolucnaewacz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137919.9280794-331-123035105761697/AnsiballZ_systemd.py'
Feb 26 20:32:00 compute-0 sudo[138230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:00 compute-0 python3.9[138257]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:32:00 compute-0 systemd[1]: Reloading.
Feb 26 20:32:00 compute-0 systemd-rc-local-generator[138685]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:00 compute-0 systemd-sysv-generator[138688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:00 compute-0 sudo[138230]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:01 compute-0 sudo[139497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgphvtwtupyjheahcgyjzzlayymydpaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137920.982601-331-95127518823810/AnsiballZ_systemd.py'
Feb 26 20:32:01 compute-0 sudo[139497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:01 compute-0 python3.9[139508]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:32:01 compute-0 systemd[1]: Reloading.
Feb 26 20:32:01 compute-0 systemd-sysv-generator[140027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:01 compute-0 systemd-rc-local-generator[140019]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:01 compute-0 sudo[139497]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:02 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:32:02 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:32:02 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.209s CPU time.
Feb 26 20:32:02 compute-0 systemd[1]: run-ra02209ba8ee5418eb38bd577e19c3a0c.service: Deactivated successfully.
Feb 26 20:32:02 compute-0 sudo[140481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqjctjhgutkyuofuwimeueuxxmvuwpvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137922.111397-360-4023654899905/AnsiballZ_systemd.py'
Feb 26 20:32:02 compute-0 sudo[140481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:02 compute-0 python3.9[140484]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:02 compute-0 systemd[1]: Reloading.
Feb 26 20:32:02 compute-0 systemd-rc-local-generator[140516]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:02 compute-0 systemd-sysv-generator[140519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:03 compute-0 sudo[140481]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:03 compute-0 sudo[140679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezusattpmyxekhzgnfvwxmlqwxmzehc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137923.2096422-360-145229350792493/AnsiballZ_systemd.py'
Feb 26 20:32:03 compute-0 sudo[140679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:03 compute-0 python3.9[140682]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:03 compute-0 systemd[1]: Reloading.
Feb 26 20:32:03 compute-0 systemd-rc-local-generator[140715]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:03 compute-0 systemd-sysv-generator[140719]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:04 compute-0 sudo[140679]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:04 compute-0 sudo[140878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swtdqvzbovqwsjghnskruvercqolvfzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137924.3167431-360-79081705784347/AnsiballZ_systemd.py'
Feb 26 20:32:04 compute-0 sudo[140878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:04 compute-0 python3.9[140881]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:04 compute-0 systemd[1]: Reloading.
Feb 26 20:32:04 compute-0 systemd-rc-local-generator[140907]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:04 compute-0 systemd-sysv-generator[140910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:05 compute-0 sudo[140878]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:05 compute-0 sudo[141075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btkrosjynocmnglppjsqyybcswjupnvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137925.3102682-360-111556422069895/AnsiballZ_systemd.py'
Feb 26 20:32:05 compute-0 sudo[141075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:05 compute-0 python3.9[141078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:06 compute-0 sudo[141075]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:06 compute-0 sudo[141231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dluarorbvbddsuvipivgmzmyeoyvpqhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137926.2286184-360-123149597498553/AnsiballZ_systemd.py'
Feb 26 20:32:06 compute-0 sudo[141231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:06 compute-0 python3.9[141234]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:06 compute-0 systemd[1]: Reloading.
Feb 26 20:32:07 compute-0 systemd-rc-local-generator[141265]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:07 compute-0 systemd-sysv-generator[141268]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:07 compute-0 sudo[141231]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:07 compute-0 sudo[141428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwtuuhagfxzyemikvgliostwpilzmpng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137927.3872411-396-66256024706646/AnsiballZ_systemd.py'
Feb 26 20:32:07 compute-0 sudo[141428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:07 compute-0 python3.9[141431]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 26 20:32:08 compute-0 systemd[1]: Reloading.
Feb 26 20:32:08 compute-0 systemd-rc-local-generator[141462]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:32:08 compute-0 systemd-sysv-generator[141467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:32:08 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 26 20:32:08 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 26 20:32:08 compute-0 sudo[141428]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:08 compute-0 sudo[141630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmlorvaphfynhppzrodldaywzbxleugl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137928.5552142-404-101668201447265/AnsiballZ_systemd.py'
Feb 26 20:32:08 compute-0 sudo[141630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:09 compute-0 python3.9[141633]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:09 compute-0 sudo[141630]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:09 compute-0 sudo[141786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oroypsnjmxporczggxpaedhjlbpfpyqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137929.3762414-404-222139334375196/AnsiballZ_systemd.py'
Feb 26 20:32:09 compute-0 sudo[141786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:09 compute-0 python3.9[141789]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:10 compute-0 sudo[141786]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:10 compute-0 sudo[141942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awavvuysegsoppfefyrykmwaxwlspzgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137930.1729176-404-281248569457591/AnsiballZ_systemd.py'
Feb 26 20:32:10 compute-0 sudo[141942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:10 compute-0 python3.9[141945]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:10 compute-0 sudo[141942]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:11 compute-0 sudo[142098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npsuqtcnecboayazwnhckmidiipsnaww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137930.9675066-404-52133058118611/AnsiballZ_systemd.py'
Feb 26 20:32:11 compute-0 sudo[142098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:11 compute-0 python3.9[142101]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:11 compute-0 sudo[142098]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:11 compute-0 sudo[142254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfupwsrneldlqxrovhcbhuxfjnxfeyim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137931.7267249-404-25444994940193/AnsiballZ_systemd.py'
Feb 26 20:32:11 compute-0 sudo[142254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:12 compute-0 python3.9[142257]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:12 compute-0 sudo[142254]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:12 compute-0 sudo[142410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvrdbakdwbwathbmsygigdyrxqrckmuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137932.4672928-404-234876495621383/AnsiballZ_systemd.py'
Feb 26 20:32:12 compute-0 sudo[142410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:13 compute-0 python3.9[142413]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:13 compute-0 sudo[142410]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:13 compute-0 sudo[142566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziozeephxfdxcxjrgkyrcpiwnhbgidxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137933.232974-404-176603078515608/AnsiballZ_systemd.py'
Feb 26 20:32:13 compute-0 sudo[142566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:13 compute-0 python3.9[142569]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:13 compute-0 sudo[142566]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:14 compute-0 sudo[142722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anvidcmvnchfquvgagjfgtxomutcinzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137933.9917564-404-72987397474671/AnsiballZ_systemd.py'
Feb 26 20:32:14 compute-0 sudo[142722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:14 compute-0 python3.9[142725]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:14 compute-0 sudo[142722]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:15 compute-0 sudo[142878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzvihcryfomhsingyknldhceuizaowht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137934.7874103-404-55179610532608/AnsiballZ_systemd.py'
Feb 26 20:32:15 compute-0 sudo[142878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:15 compute-0 python3.9[142881]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:15 compute-0 sudo[142878]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:15 compute-0 sudo[143034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzbnnqebxmsogueamntoousrxxvgaioh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137935.5746188-404-186197065306432/AnsiballZ_systemd.py'
Feb 26 20:32:15 compute-0 sudo[143034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:16 compute-0 python3.9[143037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:16 compute-0 sudo[143034]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:16 compute-0 sudo[143190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnzjnommxdfaptyisyktpxvtysjfwmke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137936.356717-404-265677600400367/AnsiballZ_systemd.py'
Feb 26 20:32:16 compute-0 sudo[143190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:17 compute-0 python3.9[143193]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:17 compute-0 sudo[143190]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:17 compute-0 podman[143195]: 2026-02-26 20:32:17.120185665 +0000 UTC m=+0.072182502 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 26 20:32:17 compute-0 sudo[143366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctyhdegtfqziilmuxfzoniexgrmdqdrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137937.2367606-404-48607014696312/AnsiballZ_systemd.py'
Feb 26 20:32:17 compute-0 sudo[143366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:17 compute-0 python3.9[143369]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:17 compute-0 sudo[143366]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:18 compute-0 sudo[143522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fukijqwwtsqidmzokipjjnfxqpqgngxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137938.0396378-404-173127314206810/AnsiballZ_systemd.py'
Feb 26 20:32:18 compute-0 sudo[143522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:18 compute-0 python3.9[143525]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:18 compute-0 sudo[143522]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:19 compute-0 sudo[143678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjaofevzbksvvdkcwdgprpnitpumjkrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137938.8927586-404-36631050566282/AnsiballZ_systemd.py'
Feb 26 20:32:19 compute-0 sudo[143678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:19 compute-0 python3.9[143681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 26 20:32:19 compute-0 sudo[143678]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:20 compute-0 sudo[143834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgubacggqxalcgvxqtoqnuscjzcirjdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137940.0068262-506-182839516977647/AnsiballZ_file.py'
Feb 26 20:32:20 compute-0 sudo[143834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:20 compute-0 python3.9[143837]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:32:20 compute-0 sudo[143834]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:20 compute-0 sudo[143987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xychploncywbahynuykgjakxpxzcorbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137940.6316218-506-45127589870368/AnsiballZ_file.py'
Feb 26 20:32:20 compute-0 sudo[143987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:21 compute-0 python3.9[143990]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:32:21 compute-0 sudo[143987]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:21 compute-0 sudo[144140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgyhqhcnlludtpdkdsjumcyhdtfgtvaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137941.2700598-506-219201666999500/AnsiballZ_file.py'
Feb 26 20:32:21 compute-0 sudo[144140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:21 compute-0 python3.9[144143]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:32:21 compute-0 sudo[144140]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:22 compute-0 sudo[144293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-safhwpdidkomgxvcphwiigpjykzdxmzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137941.8874197-506-195633462293347/AnsiballZ_file.py'
Feb 26 20:32:22 compute-0 sudo[144293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:22 compute-0 python3.9[144296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:32:22 compute-0 sudo[144293]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:22 compute-0 sudo[144446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjdthqilsinahhsttjxpoppmggblzelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137942.5557957-506-211928038002841/AnsiballZ_file.py'
Feb 26 20:32:22 compute-0 sudo[144446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:23 compute-0 python3.9[144449]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:32:23 compute-0 sudo[144446]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:23 compute-0 sudo[144599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-almafviscuntrsvencjmnerfxbllifcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137943.1478326-506-116342273646872/AnsiballZ_file.py'
Feb 26 20:32:23 compute-0 sudo[144599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:23 compute-0 python3.9[144602]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:32:23 compute-0 sudo[144599]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:24 compute-0 python3.9[144752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:32:25 compute-0 sudo[144917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dizopxltodrrsjnrbowkjhtljseafkxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137944.577612-557-182413673275593/AnsiballZ_stat.py'
Feb 26 20:32:25 compute-0 sudo[144917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:25 compute-0 podman[144876]: 2026-02-26 20:32:25.06228462 +0000 UTC m=+0.077166934 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:32:25 compute-0 python3.9[144925]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:25 compute-0 sudo[144917]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:25 compute-0 sudo[145054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flbqytmzqhgvrstnrnikcsesmqjkjbvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137944.577612-557-182413673275593/AnsiballZ_copy.py'
Feb 26 20:32:25 compute-0 sudo[145054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:25 compute-0 python3.9[145057]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137944.577612-557-182413673275593/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:25 compute-0 sudo[145054]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:26 compute-0 sudo[145207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgcxifzbwrarduttlnwxqjjghuurjoco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137946.0466945-557-207030561960593/AnsiballZ_stat.py'
Feb 26 20:32:26 compute-0 sudo[145207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:26 compute-0 python3.9[145210]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:26 compute-0 sudo[145207]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:26 compute-0 sudo[145333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbkqoqroenvwtxgutayjuufvlmmakglb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137946.0466945-557-207030561960593/AnsiballZ_copy.py'
Feb 26 20:32:26 compute-0 sudo[145333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:27 compute-0 python3.9[145336]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137946.0466945-557-207030561960593/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:27 compute-0 sudo[145333]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:27 compute-0 sudo[145486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfzorfrfvaowilmqstyyiifovlssdgwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137947.2852812-557-36681597930901/AnsiballZ_stat.py'
Feb 26 20:32:27 compute-0 sudo[145486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:27 compute-0 python3.9[145489]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:27 compute-0 sudo[145486]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:28 compute-0 sudo[145612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfjghngbcugypepxnkhpanpdiaskuiok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137947.2852812-557-36681597930901/AnsiballZ_copy.py'
Feb 26 20:32:28 compute-0 sudo[145612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:28 compute-0 python3.9[145615]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137947.2852812-557-36681597930901/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:28 compute-0 sudo[145612]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:28 compute-0 sudo[145765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzqbqosqprohzczpedvhrjcczolvyilj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137948.3928168-557-192852898648263/AnsiballZ_stat.py'
Feb 26 20:32:28 compute-0 sudo[145765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:28 compute-0 python3.9[145768]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:28 compute-0 sudo[145765]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:29 compute-0 sudo[145891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wolnyocxskikiwffaeferpmukigjzpvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137948.3928168-557-192852898648263/AnsiballZ_copy.py'
Feb 26 20:32:29 compute-0 sudo[145891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:29 compute-0 python3.9[145894]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137948.3928168-557-192852898648263/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:29 compute-0 sudo[145891]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:29 compute-0 sudo[146044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgobisbytdkvfofqzgypykancjbmhqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137949.5193458-557-66041476491551/AnsiballZ_stat.py'
Feb 26 20:32:29 compute-0 sudo[146044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:29 compute-0 python3.9[146047]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:29 compute-0 sudo[146044]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:30 compute-0 sudo[146170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djqwyagvnwygzauvuyyrxzmfqgxemcfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137949.5193458-557-66041476491551/AnsiballZ_copy.py'
Feb 26 20:32:30 compute-0 sudo[146170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:30 compute-0 python3.9[146173]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137949.5193458-557-66041476491551/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:30 compute-0 sudo[146170]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:30 compute-0 sudo[146323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejuttyctfesvhwwabrbspgecphdxgham ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137950.569462-557-17057151470042/AnsiballZ_stat.py'
Feb 26 20:32:30 compute-0 sudo[146323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:31 compute-0 python3.9[146326]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:31 compute-0 sudo[146323]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:31 compute-0 sudo[146449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzeswwxvsgzmjylcqwekvpbfxhjsfsox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137950.569462-557-17057151470042/AnsiballZ_copy.py'
Feb 26 20:32:31 compute-0 sudo[146449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:31 compute-0 python3.9[146452]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137950.569462-557-17057151470042/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:31 compute-0 sudo[146449]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:32 compute-0 sudo[146602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeakwvlacpwmjmrajtrezkmhrwpzmbcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137951.7602882-557-233389007313615/AnsiballZ_stat.py'
Feb 26 20:32:32 compute-0 sudo[146602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:32 compute-0 python3.9[146605]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:32 compute-0 sudo[146602]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:32 compute-0 sudo[146726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwxizptvruawzfufsedijvjapvddwncg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137951.7602882-557-233389007313615/AnsiballZ_copy.py'
Feb 26 20:32:32 compute-0 sudo[146726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:32 compute-0 python3.9[146729]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137951.7602882-557-233389007313615/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:32 compute-0 sudo[146726]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:33 compute-0 sudo[146879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sntklyelqxevggazunntiekwajfauvme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137953.062089-557-138851895572764/AnsiballZ_stat.py'
Feb 26 20:32:33 compute-0 sudo[146879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:33 compute-0 python3.9[146882]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:33 compute-0 sudo[146879]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:33 compute-0 sudo[147005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzmxmigdnsyriaofohexfdbehscicsyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137953.062089-557-138851895572764/AnsiballZ_copy.py'
Feb 26 20:32:33 compute-0 sudo[147005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:33 compute-0 python3.9[147008]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772137953.062089-557-138851895572764/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:33 compute-0 sudo[147005]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:34 compute-0 sudo[147158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giiucwfiwdmfbjgxossacpgrudwdcjzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137954.163076-670-75644286797454/AnsiballZ_command.py'
Feb 26 20:32:34 compute-0 sudo[147158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:34 compute-0 python3.9[147161]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 26 20:32:34 compute-0 sudo[147158]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:35 compute-0 sudo[147312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwzugileuegopvpdkymnthjzdktrjhsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137954.801158-679-121108344143690/AnsiballZ_file.py'
Feb 26 20:32:35 compute-0 sudo[147312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:35 compute-0 python3.9[147315]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:35 compute-0 sudo[147312]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:35 compute-0 sudo[147465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxwpywfnohdygqgkvahxbzqoiypyzrxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137955.6588316-679-163351692907503/AnsiballZ_file.py'
Feb 26 20:32:35 compute-0 sudo[147465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:36 compute-0 python3.9[147468]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:36 compute-0 sudo[147465]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:36 compute-0 sudo[147618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umdzdgupvatpgshjaoetqvmgyzoiiely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137956.3055043-679-28011801322152/AnsiballZ_file.py'
Feb 26 20:32:36 compute-0 sudo[147618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:36 compute-0 python3.9[147621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:36 compute-0 sudo[147618]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:37 compute-0 sudo[147771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjkgxxjpylukjhllcigpiphcrjgifqex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137956.9595022-679-14967290610653/AnsiballZ_file.py'
Feb 26 20:32:37 compute-0 sudo[147771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:37 compute-0 python3.9[147774]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:37 compute-0 sudo[147771]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:37 compute-0 sudo[147924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkngzswynbuqbuahsflcyerdotgiulsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137957.558158-679-242237966400704/AnsiballZ_file.py'
Feb 26 20:32:37 compute-0 sudo[147924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:37 compute-0 python3.9[147927]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:37 compute-0 sudo[147924]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:38 compute-0 sudo[148077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeqvpqqilezxhzduoejgiefsxyewmtfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137958.1405272-679-143356120885447/AnsiballZ_file.py'
Feb 26 20:32:38 compute-0 sudo[148077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:38 compute-0 python3.9[148080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:38 compute-0 sudo[148077]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:38 compute-0 sudo[148230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdiwhcdclhbevxswhndhqseulysmbzlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137958.7538233-679-200089076258711/AnsiballZ_file.py'
Feb 26 20:32:38 compute-0 sudo[148230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:39 compute-0 python3.9[148233]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:39 compute-0 sudo[148230]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:39 compute-0 sudo[148383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oudqiiyuyloyllxnfdbzquqmxbcejaaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137959.3154416-679-275108226791733/AnsiballZ_file.py'
Feb 26 20:32:39 compute-0 sudo[148383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:39 compute-0 python3.9[148386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:39 compute-0 sudo[148383]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:40 compute-0 sudo[148536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asqltjrtbmsaxlgjzsiqqffjxerrwjau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137959.9038236-679-257397346665753/AnsiballZ_file.py'
Feb 26 20:32:40 compute-0 sudo[148536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:40 compute-0 python3.9[148539]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:40 compute-0 sudo[148536]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:40 compute-0 sudo[148689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwumrpfoeyzcvlszlhqtkbnhllehpkbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137960.507208-679-171318173320705/AnsiballZ_file.py'
Feb 26 20:32:40 compute-0 sudo[148689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:40 compute-0 python3.9[148692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:40 compute-0 sudo[148689]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:41 compute-0 sudo[148842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcsprvckjcuelsigdbhsvuivntowzsmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137961.0814738-679-112792497414046/AnsiballZ_file.py'
Feb 26 20:32:41 compute-0 sudo[148842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:41 compute-0 python3.9[148845]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:41 compute-0 sudo[148842]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:41 compute-0 sudo[148995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qonactnfytbwykdntylwctdsrimmyoge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137961.6490228-679-279882387971111/AnsiballZ_file.py'
Feb 26 20:32:41 compute-0 sudo[148995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:42 compute-0 python3.9[148998]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:42 compute-0 sudo[148995]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:42 compute-0 sudo[149148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qozxsddlvnsisgifgbeewunenltacdkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137962.2207863-679-259350343256269/AnsiballZ_file.py'
Feb 26 20:32:42 compute-0 sudo[149148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:42 compute-0 python3.9[149151]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:42 compute-0 sudo[149148]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:43 compute-0 sudo[149301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzmjbwodtjdelmnjzsudtrmcqgdsljiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137962.8159046-679-216463011325006/AnsiballZ_file.py'
Feb 26 20:32:43 compute-0 sudo[149301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:43 compute-0 python3.9[149304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:43 compute-0 sudo[149301]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:43 compute-0 sudo[149454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvyrzyzenzznbxvmspklqrvpkrinwnra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137963.4556303-778-183153818048570/AnsiballZ_stat.py'
Feb 26 20:32:43 compute-0 sudo[149454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:43 compute-0 python3.9[149457]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:44 compute-0 sudo[149454]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:44 compute-0 sudo[149578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtyciylwqweonqzaaidmenrmxknzgkou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137963.4556303-778-183153818048570/AnsiballZ_copy.py'
Feb 26 20:32:44 compute-0 sudo[149578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:44 compute-0 python3.9[149581]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137963.4556303-778-183153818048570/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:44 compute-0 sudo[149578]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:44 compute-0 sudo[149731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anudosvwhjfnqudimympgthnpohugjtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137964.7109647-778-140987651983331/AnsiballZ_stat.py'
Feb 26 20:32:44 compute-0 sudo[149731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:45 compute-0 python3.9[149734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:45 compute-0 sudo[149731]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:45 compute-0 sudo[149855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edrfulugqoiflpvnzeslxhaeeyhhnxum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137964.7109647-778-140987651983331/AnsiballZ_copy.py'
Feb 26 20:32:45 compute-0 sudo[149855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:45 compute-0 python3.9[149858]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137964.7109647-778-140987651983331/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:45 compute-0 sudo[149855]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:46 compute-0 sudo[150008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baqhjfvrvwqbaoocwenuzoylrcmsubwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137965.8627908-778-190765124766675/AnsiballZ_stat.py'
Feb 26 20:32:46 compute-0 sudo[150008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:46 compute-0 python3.9[150011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:46 compute-0 sudo[150008]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:32:46.491 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:32:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:32:46.492 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:32:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:32:46.492 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:32:46 compute-0 sudo[150132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htgtqpixkgtaphfijtorwyoxiwohibev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137965.8627908-778-190765124766675/AnsiballZ_copy.py'
Feb 26 20:32:46 compute-0 sudo[150132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:46 compute-0 python3.9[150135]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137965.8627908-778-190765124766675/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:46 compute-0 sudo[150132]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:47 compute-0 sudo[150285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qshwvgxdrwlrfqzezynvqargbquqzsbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137966.8772347-778-188442183239032/AnsiballZ_stat.py'
Feb 26 20:32:47 compute-0 sudo[150285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:47 compute-0 python3.9[150288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:47 compute-0 sudo[150285]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:47 compute-0 podman[150289]: 2026-02-26 20:32:47.39579007 +0000 UTC m=+0.050612927 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:32:47 compute-0 sudo[150429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azmuyvlfbkulmisukhvqbgnxpldorogo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137966.8772347-778-188442183239032/AnsiballZ_copy.py'
Feb 26 20:32:47 compute-0 sudo[150429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:47 compute-0 python3.9[150432]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137966.8772347-778-188442183239032/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:47 compute-0 sudo[150429]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:48 compute-0 sudo[150582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahrzbieporgbpmpkazusmiikifprzmch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137967.947519-778-138812322384326/AnsiballZ_stat.py'
Feb 26 20:32:48 compute-0 sudo[150582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:48 compute-0 python3.9[150585]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:48 compute-0 sudo[150582]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:48 compute-0 sudo[150706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxxuqjwzydhfdxlpgozfhtqsymqwiwfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137967.947519-778-138812322384326/AnsiballZ_copy.py'
Feb 26 20:32:48 compute-0 sudo[150706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:48 compute-0 python3.9[150709]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137967.947519-778-138812322384326/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:48 compute-0 sudo[150706]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:49 compute-0 sudo[150859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwiwmuypievpyfzxrstjocszgvsgsfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137968.981952-778-127394536131774/AnsiballZ_stat.py'
Feb 26 20:32:49 compute-0 sudo[150859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:49 compute-0 python3.9[150862]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:49 compute-0 sudo[150859]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:49 compute-0 sudo[150983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfcedxiggpaclbhbavidhtngolfpxryp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137968.981952-778-127394536131774/AnsiballZ_copy.py'
Feb 26 20:32:49 compute-0 sudo[150983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:49 compute-0 python3.9[150986]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137968.981952-778-127394536131774/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:50 compute-0 sudo[150983]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:50 compute-0 sudo[151136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwlujrbjabwdoqueihlwcnsrdgqyplle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137970.1617916-778-78913098139346/AnsiballZ_stat.py'
Feb 26 20:32:50 compute-0 sudo[151136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:50 compute-0 python3.9[151139]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:50 compute-0 sudo[151136]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:50 compute-0 sudo[151260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsfazlcjiooujmvvkdeagqepwkwkggvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137970.1617916-778-78913098139346/AnsiballZ_copy.py'
Feb 26 20:32:50 compute-0 sudo[151260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:51 compute-0 python3.9[151263]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137970.1617916-778-78913098139346/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:51 compute-0 sudo[151260]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:51 compute-0 sudo[151413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hahxocyqeomqmaqhemriavamiotmqlww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137971.3565698-778-204023326734474/AnsiballZ_stat.py'
Feb 26 20:32:51 compute-0 sudo[151413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:51 compute-0 python3.9[151416]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:51 compute-0 sudo[151413]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:52 compute-0 sudo[151537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjdomlenvzaxxgaxgnqxyxggzwydrfvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137971.3565698-778-204023326734474/AnsiballZ_copy.py'
Feb 26 20:32:52 compute-0 sudo[151537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:52 compute-0 python3.9[151540]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137971.3565698-778-204023326734474/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:52 compute-0 sudo[151537]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:52 compute-0 sudo[151690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztxsrynnagqfjmdvuqbteswxcwuyjgtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137972.496283-778-104811302670374/AnsiballZ_stat.py'
Feb 26 20:32:52 compute-0 sudo[151690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:52 compute-0 python3.9[151693]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:52 compute-0 sudo[151690]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:53 compute-0 sudo[151814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffpdskwnjshjkgtyzlavkqbpwtuuzzof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137972.496283-778-104811302670374/AnsiballZ_copy.py'
Feb 26 20:32:53 compute-0 sudo[151814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:53 compute-0 python3.9[151817]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137972.496283-778-104811302670374/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:53 compute-0 sudo[151814]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:53 compute-0 sudo[151967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jirkyoneimrpqmhgeoahtammzougkdjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137973.58722-778-40739556556469/AnsiballZ_stat.py'
Feb 26 20:32:53 compute-0 sudo[151967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:54 compute-0 python3.9[151970]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:54 compute-0 sudo[151967]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:54 compute-0 sudo[152091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzyieabogbgrzbxeyjcwnysunvmbchnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137973.58722-778-40739556556469/AnsiballZ_copy.py'
Feb 26 20:32:54 compute-0 sudo[152091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:54 compute-0 python3.9[152094]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137973.58722-778-40739556556469/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:54 compute-0 sudo[152091]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:54 compute-0 sudo[152244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afjjpqoceainpurdbacymoklmbexgsyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137974.7136202-778-43061987911700/AnsiballZ_stat.py'
Feb 26 20:32:54 compute-0 sudo[152244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:55 compute-0 python3.9[152247]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:55 compute-0 sudo[152244]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:55 compute-0 podman[152248]: 2026-02-26 20:32:55.312568121 +0000 UTC m=+0.102152388 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 26 20:32:55 compute-0 sudo[152396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhdcnaetgnwimrolvecvmfumwbapyijg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137974.7136202-778-43061987911700/AnsiballZ_copy.py'
Feb 26 20:32:55 compute-0 sudo[152396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:55 compute-0 python3.9[152399]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137974.7136202-778-43061987911700/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:55 compute-0 sudo[152396]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:56 compute-0 sudo[152549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prhwmgmlximjzgawbnxobralbfpfdshy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137975.8695161-778-71760405269986/AnsiballZ_stat.py'
Feb 26 20:32:56 compute-0 sudo[152549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:56 compute-0 python3.9[152552]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:56 compute-0 sudo[152549]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:56 compute-0 sudo[152673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aioavckzbbbqdrnskpyvtdvfgvxtdxhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137975.8695161-778-71760405269986/AnsiballZ_copy.py'
Feb 26 20:32:56 compute-0 sudo[152673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:56 compute-0 python3.9[152676]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137975.8695161-778-71760405269986/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:56 compute-0 sudo[152673]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:57 compute-0 sudo[152826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inrztgjramczegpndiukquyowlqfihbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137976.987394-778-22107360863474/AnsiballZ_stat.py'
Feb 26 20:32:57 compute-0 sudo[152826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:57 compute-0 python3.9[152829]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:57 compute-0 sudo[152826]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:57 compute-0 sudo[152950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqldkbggjzxcjgpdhuecupsprzoewiic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137976.987394-778-22107360863474/AnsiballZ_copy.py'
Feb 26 20:32:57 compute-0 sudo[152950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:57 compute-0 python3.9[152953]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137976.987394-778-22107360863474/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:58 compute-0 sudo[152950]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:58 compute-0 sudo[153103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbchuvnnyngcjkgxcztguhqewojyvtkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137978.113132-778-2464795481866/AnsiballZ_stat.py'
Feb 26 20:32:58 compute-0 sudo[153103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:58 compute-0 python3.9[153106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:32:58 compute-0 sudo[153103]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:59 compute-0 sudo[153227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kipvwvhjpvkewpuqyqqynhzwwcyuecln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137978.113132-778-2464795481866/AnsiballZ_copy.py'
Feb 26 20:32:59 compute-0 sudo[153227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:32:59 compute-0 python3.9[153230]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772137978.113132-778-2464795481866/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:32:59 compute-0 sudo[153227]: pam_unix(sudo:session): session closed for user root
Feb 26 20:32:59 compute-0 python3.9[153380]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:00 compute-0 sudo[153533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aumbfuxvbdipbploedhzxjhoigihprql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137980.1662123-984-106069876811535/AnsiballZ_seboolean.py'
Feb 26 20:33:00 compute-0 sudo[153533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:00 compute-0 python3.9[153536]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 26 20:33:01 compute-0 sudo[153533]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:02 compute-0 sudo[153690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgmwpnmuxeabyjsbbmbnkosfcenjkwjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137982.1916587-992-154779116575718/AnsiballZ_copy.py'
Feb 26 20:33:02 compute-0 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 26 20:33:02 compute-0 sudo[153690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:02 compute-0 python3.9[153693]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:02 compute-0 sudo[153690]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:03 compute-0 sudo[153843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matqbomqrfmsmwmlisebqjbcppeecvto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137982.8717635-992-140712167591233/AnsiballZ_copy.py'
Feb 26 20:33:03 compute-0 sudo[153843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:03 compute-0 python3.9[153846]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:03 compute-0 sudo[153843]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:03 compute-0 sudo[153996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhgfinrfzwpblbyhezhszamxurdcipsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137983.5003922-992-104568521006596/AnsiballZ_copy.py'
Feb 26 20:33:03 compute-0 sudo[153996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:04 compute-0 python3.9[153999]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:04 compute-0 sudo[153996]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:04 compute-0 sudo[154149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiyumrbuzqjiphzwgmnsferyayfkqzlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137984.196417-992-39425629716080/AnsiballZ_copy.py'
Feb 26 20:33:04 compute-0 sudo[154149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:04 compute-0 python3.9[154152]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:04 compute-0 sudo[154149]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:05 compute-0 sudo[154302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgcipvokgsiafdeibajumlkarjdepywl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137984.8404753-992-223661352317389/AnsiballZ_copy.py'
Feb 26 20:33:05 compute-0 sudo[154302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:05 compute-0 python3.9[154305]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:05 compute-0 sudo[154302]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:05 compute-0 sudo[154455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulhthlwoeoojwvonldbxkvppjbaeizxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137985.498173-1028-131017049735587/AnsiballZ_copy.py'
Feb 26 20:33:05 compute-0 sudo[154455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:05 compute-0 python3.9[154458]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:05 compute-0 sudo[154455]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:06 compute-0 sudo[154608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilfdjihzgozulxtppopkmxfynkazdqdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137986.1094437-1028-122241487075971/AnsiballZ_copy.py'
Feb 26 20:33:06 compute-0 sudo[154608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:06 compute-0 python3.9[154611]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:06 compute-0 sudo[154608]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:06 compute-0 sudo[154761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxsvjsnbflldoebsytfzezmhpxglxgec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137986.6963973-1028-81776776106449/AnsiballZ_copy.py'
Feb 26 20:33:06 compute-0 sudo[154761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:07 compute-0 python3.9[154764]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:07 compute-0 sudo[154761]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:07 compute-0 sudo[154914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylveesdkdgegtubdygvgyicgzfettufk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137987.300815-1028-172342262360212/AnsiballZ_copy.py'
Feb 26 20:33:07 compute-0 sudo[154914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:07 compute-0 python3.9[154917]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:07 compute-0 sudo[154914]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:08 compute-0 sudo[155067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgiiauxesgyctlrzjuujxysfubqexldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137987.9576766-1028-202606783873231/AnsiballZ_copy.py'
Feb 26 20:33:08 compute-0 sudo[155067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:08 compute-0 python3.9[155070]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:08 compute-0 sudo[155067]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:08 compute-0 sudo[155220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozqzvnmczihorpomvazuompqslukofah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137988.5496492-1064-47210270989858/AnsiballZ_systemd.py'
Feb 26 20:33:08 compute-0 sudo[155220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:09 compute-0 python3.9[155223]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:33:09 compute-0 systemd[1]: Reloading.
Feb 26 20:33:09 compute-0 systemd-rc-local-generator[155243]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:09 compute-0 systemd-sysv-generator[155248]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:09 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 26 20:33:09 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 26 20:33:09 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 26 20:33:09 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 26 20:33:09 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 26 20:33:09 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 26 20:33:09 compute-0 sudo[155220]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:09 compute-0 sudo[155421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnxsymxyfgvxqktqvgniwtvqdkuqguby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137989.6389773-1064-133013646833315/AnsiballZ_systemd.py'
Feb 26 20:33:09 compute-0 sudo[155421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:10 compute-0 python3.9[155424]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:33:10 compute-0 systemd[1]: Reloading.
Feb 26 20:33:10 compute-0 systemd-rc-local-generator[155449]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:10 compute-0 systemd-sysv-generator[155455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:10 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 26 20:33:10 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 26 20:33:10 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 26 20:33:10 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 26 20:33:10 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 26 20:33:10 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 26 20:33:10 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 26 20:33:10 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 26 20:33:10 compute-0 sudo[155421]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:10 compute-0 sudo[155645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaymgphgifcrbxszbrywfpvyppsbywil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137990.646233-1064-45867247707259/AnsiballZ_systemd.py'
Feb 26 20:33:10 compute-0 sudo[155645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:11 compute-0 python3.9[155648]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:33:11 compute-0 systemd[1]: Reloading.
Feb 26 20:33:11 compute-0 systemd-rc-local-generator[155669]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:11 compute-0 systemd-sysv-generator[155672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:11 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 26 20:33:11 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 26 20:33:11 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 26 20:33:11 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 26 20:33:11 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 26 20:33:11 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 26 20:33:11 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 26 20:33:11 compute-0 sudo[155645]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:11 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 26 20:33:11 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 26 20:33:11 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 26 20:33:11 compute-0 sudo[155872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nojtgajjiqtoynyroqowwxhkmtjhjthi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137991.602304-1064-228243686617148/AnsiballZ_systemd.py'
Feb 26 20:33:11 compute-0 sudo[155872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:12 compute-0 python3.9[155875]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:33:12 compute-0 systemd[1]: Reloading.
Feb 26 20:33:12 compute-0 systemd-rc-local-generator[155903]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:12 compute-0 systemd-sysv-generator[155906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:12 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 26 20:33:12 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 26 20:33:12 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 26 20:33:12 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 26 20:33:12 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 26 20:33:12 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 26 20:33:12 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 26 20:33:12 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 26 20:33:12 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 26 20:33:12 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 26 20:33:12 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 26 20:33:12 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 26 20:33:12 compute-0 sudo[155872]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:12 compute-0 setroubleshoot[155692]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 9ba8f028-db20-4ddc-baf2-6ff5cc0853db
Feb 26 20:33:12 compute-0 setroubleshoot[155692]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 26 20:33:12 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:33:12 compute-0 sudo[156099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kylojyvfgrntppikbcqmshfqqnhfhebi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137992.6735153-1064-142633373088048/AnsiballZ_systemd.py'
Feb 26 20:33:12 compute-0 sudo[156099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:13 compute-0 python3.9[156102]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:33:13 compute-0 systemd[1]: Reloading.
Feb 26 20:33:13 compute-0 systemd-rc-local-generator[156128]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:13 compute-0 systemd-sysv-generator[156131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:13 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 26 20:33:13 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 26 20:33:13 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 26 20:33:13 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 26 20:33:13 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 26 20:33:13 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 26 20:33:13 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 26 20:33:13 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 26 20:33:13 compute-0 sudo[156099]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:13 compute-0 sudo[156319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmlsftbzkktqnvoiprcpmmjdzalucaou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137993.7409532-1101-169347022241521/AnsiballZ_file.py'
Feb 26 20:33:13 compute-0 sudo[156319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:14 compute-0 python3.9[156322]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:14 compute-0 sudo[156319]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:14 compute-0 sudo[156472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxotrsfnitufsjexxcbrwgcgxdwinfmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137994.3000557-1109-130880467362192/AnsiballZ_find.py'
Feb 26 20:33:14 compute-0 sudo[156472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:14 compute-0 python3.9[156475]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 26 20:33:14 compute-0 sudo[156472]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:15 compute-0 sudo[156625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sidnbvldwhounsooyvethclwhcknliwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137995.0639005-1123-217239662689392/AnsiballZ_stat.py'
Feb 26 20:33:15 compute-0 sudo[156625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:15 compute-0 python3.9[156628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:15 compute-0 sudo[156625]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:15 compute-0 sudo[156749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iohujqjnanzhetvmtnsaxyjrxrdbtqey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137995.0639005-1123-217239662689392/AnsiballZ_copy.py'
Feb 26 20:33:15 compute-0 sudo[156749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:15 compute-0 python3.9[156752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1772137995.0639005-1123-217239662689392/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:16 compute-0 sudo[156749]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:16 compute-0 sudo[156902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttndeejajzdotqxviqcogtcsvzrzhzts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137996.254188-1139-242752458477304/AnsiballZ_file.py'
Feb 26 20:33:16 compute-0 sudo[156902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:16 compute-0 python3.9[156905]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:16 compute-0 sudo[156902]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:17 compute-0 sudo[157055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksthgscumdnrjfxokpauiklcelsodcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137997.0110712-1147-67231794046165/AnsiballZ_stat.py'
Feb 26 20:33:17 compute-0 sudo[157055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:17 compute-0 python3.9[157058]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:17 compute-0 sudo[157055]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:17 compute-0 podman[157059]: 2026-02-26 20:33:17.557952786 +0000 UTC m=+0.081526660 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:33:17 compute-0 sudo[157154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iipjsfehzgxlnvfivjxbjzkwgmlhttet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137997.0110712-1147-67231794046165/AnsiballZ_file.py'
Feb 26 20:33:17 compute-0 sudo[157154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:17 compute-0 python3.9[157157]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:17 compute-0 sudo[157154]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:18 compute-0 sudo[157307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvcvqxpyzjumyaxjuunfgitxqrrerlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137998.1176562-1159-93573922187542/AnsiballZ_stat.py'
Feb 26 20:33:18 compute-0 sudo[157307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:18 compute-0 python3.9[157310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:18 compute-0 sudo[157307]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:18 compute-0 sudo[157386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snvatqstzxnvytibqhlqosomlddtipjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137998.1176562-1159-93573922187542/AnsiballZ_file.py'
Feb 26 20:33:18 compute-0 sudo[157386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:19 compute-0 python3.9[157389]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ycm0fnkx recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:19 compute-0 sudo[157386]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:19 compute-0 sudo[157539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzbqzdspkmjwwrotqnoxzinwdrjdsczr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137999.185391-1171-170990234012650/AnsiballZ_stat.py'
Feb 26 20:33:19 compute-0 sudo[157539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:19 compute-0 python3.9[157542]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:19 compute-0 sudo[157539]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:19 compute-0 sudo[157618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfdbeyepywrbkubzksuycnbmiluttdrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772137999.185391-1171-170990234012650/AnsiballZ_file.py'
Feb 26 20:33:19 compute-0 sudo[157618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:20 compute-0 python3.9[157621]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:20 compute-0 sudo[157618]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:20 compute-0 sudo[157771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enegldsldzdqwzacrlrfanpxlgxymtst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138000.329813-1184-146766236416018/AnsiballZ_command.py'
Feb 26 20:33:20 compute-0 sudo[157771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:20 compute-0 python3.9[157774]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:20 compute-0 sudo[157771]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:21 compute-0 sudo[157925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svootabordorilsslrkznsjddkxiapqa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138000.9787607-1192-73092193113556/AnsiballZ_edpm_nftables_from_files.py'
Feb 26 20:33:21 compute-0 sudo[157925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:21 compute-0 python3[157928]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 26 20:33:21 compute-0 sudo[157925]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:21 compute-0 sudo[158078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftlgexuephdstvnzmxaxbexksnmtoera ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138001.7110167-1200-78759843035560/AnsiballZ_stat.py'
Feb 26 20:33:22 compute-0 sudo[158078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:22 compute-0 python3.9[158081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:22 compute-0 sudo[158078]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:22 compute-0 sudo[158157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxhekmgyhwbobshljmvsosttxmmyvapx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138001.7110167-1200-78759843035560/AnsiballZ_file.py'
Feb 26 20:33:22 compute-0 sudo[158157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:22 compute-0 python3.9[158160]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:22 compute-0 sudo[158157]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:22 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 26 20:33:22 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 26 20:33:23 compute-0 sudo[158310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xevbxrbjwkkugavadwjygaqcmluyueee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138002.82613-1212-3781029158046/AnsiballZ_stat.py'
Feb 26 20:33:23 compute-0 sudo[158310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:23 compute-0 python3.9[158313]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:23 compute-0 sudo[158310]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:23 compute-0 sudo[158436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwvpffrbbxsrrqkxgilonhrwxhleufgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138002.82613-1212-3781029158046/AnsiballZ_copy.py'
Feb 26 20:33:23 compute-0 sudo[158436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:23 compute-0 python3.9[158439]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772138002.82613-1212-3781029158046/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:23 compute-0 sudo[158436]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:24 compute-0 sudo[158589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hghaugmudgalftoyctngfkwitffrcfnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138004.0360684-1227-142817448994521/AnsiballZ_stat.py'
Feb 26 20:33:24 compute-0 sudo[158589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:24 compute-0 python3.9[158592]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:24 compute-0 sudo[158589]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:24 compute-0 sudo[158668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dineqijsemmgggxfhmcoycvibbjuhktw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138004.0360684-1227-142817448994521/AnsiballZ_file.py'
Feb 26 20:33:24 compute-0 sudo[158668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:24 compute-0 python3.9[158671]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:24 compute-0 sudo[158668]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:25 compute-0 sudo[158821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zobxcyjfsepnsrvpqzvrcamquslbepwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138005.034293-1239-176853829312694/AnsiballZ_stat.py'
Feb 26 20:33:25 compute-0 sudo[158821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:25 compute-0 python3.9[158824]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:25 compute-0 sudo[158821]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:25 compute-0 podman[158825]: 2026-02-26 20:33:25.661350839 +0000 UTC m=+0.166108608 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 26 20:33:25 compute-0 sudo[158927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtgfxqkykpyhmtgtuvtimfxzibjoihib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138005.034293-1239-176853829312694/AnsiballZ_file.py'
Feb 26 20:33:25 compute-0 sudo[158927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:25 compute-0 python3.9[158930]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:25 compute-0 sudo[158927]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:26 compute-0 sudo[159080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbxezrcaodfgwfqwurxnviqhlgimdvpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138006.0666938-1251-207728619108161/AnsiballZ_stat.py'
Feb 26 20:33:26 compute-0 sudo[159080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:26 compute-0 python3.9[159083]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:26 compute-0 sudo[159080]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:26 compute-0 sudo[159206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxiuxjejarmojfihmzpeemqcpnyuuyxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138006.0666938-1251-207728619108161/AnsiballZ_copy.py'
Feb 26 20:33:26 compute-0 sudo[159206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:27 compute-0 python3.9[159209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772138006.0666938-1251-207728619108161/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:27 compute-0 sudo[159206]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:27 compute-0 sudo[159359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okfomxdjoztwomjglwgwvqogkvliekzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138007.3548234-1266-128581784309213/AnsiballZ_file.py'
Feb 26 20:33:27 compute-0 sudo[159359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:27 compute-0 python3.9[159362]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:27 compute-0 sudo[159359]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:28 compute-0 sudo[159512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hciappmfmqvpttspvyeltmjqkslcnhhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138007.9759052-1274-143403661924855/AnsiballZ_command.py'
Feb 26 20:33:28 compute-0 sudo[159512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:28 compute-0 python3.9[159515]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:28 compute-0 sudo[159512]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:28 compute-0 sudo[159668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuiijwcgudoxyvmhofuxhkqumgnqflbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138008.558612-1282-151791422848225/AnsiballZ_blockinfile.py'
Feb 26 20:33:28 compute-0 sudo[159668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:29 compute-0 python3.9[159671]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:29 compute-0 sudo[159668]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:29 compute-0 sudo[159821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojqhedjjohcifkvkokhobbrzksjybkok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138009.4135327-1291-182426839570173/AnsiballZ_command.py'
Feb 26 20:33:29 compute-0 sudo[159821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:29 compute-0 python3.9[159824]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:29 compute-0 sudo[159821]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:30 compute-0 sudo[159975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrvpslnvxhpjfbfczhttxkufglkjdfad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138010.0361981-1299-1035989969943/AnsiballZ_stat.py'
Feb 26 20:33:30 compute-0 sudo[159975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:30 compute-0 python3.9[159978]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:33:30 compute-0 sudo[159975]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:31 compute-0 sudo[160130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyrbrijogqvpfibbdoqhntezxohkyqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138010.808625-1307-160426908140978/AnsiballZ_command.py'
Feb 26 20:33:31 compute-0 sudo[160130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:31 compute-0 python3.9[160133]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:31 compute-0 sudo[160130]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:31 compute-0 sudo[160286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgexmwoppywacnimalwbdrhcvlqtnefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138011.3906934-1315-26516938828093/AnsiballZ_file.py'
Feb 26 20:33:31 compute-0 sudo[160286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:31 compute-0 python3.9[160289]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:31 compute-0 sudo[160286]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:32 compute-0 sudo[160439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hunplaiqwqosojlvrubnylmhshvbswrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138011.991787-1323-32422656885088/AnsiballZ_stat.py'
Feb 26 20:33:32 compute-0 sudo[160439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:32 compute-0 python3.9[160442]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:32 compute-0 sudo[160439]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:33 compute-0 sudo[160563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfauyxalgwruqoxfwbqxuddgbsevsfpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138011.991787-1323-32422656885088/AnsiballZ_copy.py'
Feb 26 20:33:33 compute-0 sudo[160563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:33 compute-0 python3.9[160566]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138011.991787-1323-32422656885088/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:33 compute-0 sudo[160563]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:33 compute-0 sudo[160716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuftmdvtsnvorejfonrqqabwlqqlpohs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138013.3855486-1338-185010756637309/AnsiballZ_stat.py'
Feb 26 20:33:33 compute-0 sudo[160716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:33 compute-0 python3.9[160719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:33 compute-0 sudo[160716]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:34 compute-0 sudo[160840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvbymlmtwcpdrhlbfudycfwsuyeiyjae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138013.3855486-1338-185010756637309/AnsiballZ_copy.py'
Feb 26 20:33:34 compute-0 sudo[160840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:34 compute-0 python3.9[160843]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138013.3855486-1338-185010756637309/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:34 compute-0 sudo[160840]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:34 compute-0 sudo[160993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khfinzrndgstcwghppegcbnjtrsqmqte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138014.4543746-1353-25222780523551/AnsiballZ_stat.py'
Feb 26 20:33:34 compute-0 sudo[160993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:34 compute-0 python3.9[160996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:34 compute-0 sudo[160993]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:35 compute-0 sudo[161117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqdhgscmxplohthtdarzzhinbiroxmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138014.4543746-1353-25222780523551/AnsiballZ_copy.py'
Feb 26 20:33:35 compute-0 sudo[161117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:35 compute-0 python3.9[161120]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138014.4543746-1353-25222780523551/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:35 compute-0 sudo[161117]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:35 compute-0 sudo[161270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suvmnicwklkrenfdlbmjgchpowejhmis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138015.5855927-1368-259185943432948/AnsiballZ_systemd.py'
Feb 26 20:33:35 compute-0 sudo[161270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:36 compute-0 python3.9[161273]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:33:36 compute-0 systemd[1]: Reloading.
Feb 26 20:33:36 compute-0 systemd-sysv-generator[161308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:36 compute-0 systemd-rc-local-generator[161303]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:36 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 26 20:33:36 compute-0 sudo[161270]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:36 compute-0 sudo[161471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oobhrnvyvdvgvikrhxhweaequkdnqtar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138016.623029-1376-40855142329397/AnsiballZ_systemd.py'
Feb 26 20:33:36 compute-0 sudo[161471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:37 compute-0 python3.9[161474]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 26 20:33:37 compute-0 systemd[1]: Reloading.
Feb 26 20:33:37 compute-0 systemd-rc-local-generator[161499]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:37 compute-0 systemd-sysv-generator[161504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:37 compute-0 systemd[1]: Reloading.
Feb 26 20:33:37 compute-0 systemd-rc-local-generator[161541]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:33:37 compute-0 systemd-sysv-generator[161546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:33:37 compute-0 sudo[161471]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:38 compute-0 sshd-session[106490]: Connection closed by 192.168.122.30 port 37370
Feb 26 20:33:38 compute-0 sshd-session[106481]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:33:38 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 26 20:33:38 compute-0 systemd[1]: session-22.scope: Consumed 3min 5.747s CPU time.
Feb 26 20:33:38 compute-0 systemd-logind[825]: Session 22 logged out. Waiting for processes to exit.
Feb 26 20:33:38 compute-0 systemd-logind[825]: Removed session 22.
Feb 26 20:33:43 compute-0 sshd-session[161585]: Accepted publickey for zuul from 192.168.122.30 port 53652 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:33:43 compute-0 systemd-logind[825]: New session 23 of user zuul.
Feb 26 20:33:43 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 26 20:33:43 compute-0 sshd-session[161585]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:33:44 compute-0 python3.9[161738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:33:46 compute-0 python3.9[161892]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:33:46 compute-0 network[161909]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:33:46 compute-0 network[161910]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:33:46 compute-0 network[161911]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:33:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:33:46.493 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:33:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:33:46.495 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:33:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:33:46.495 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:33:47 compute-0 podman[161982]: 2026-02-26 20:33:47.6718118 +0000 UTC m=+0.065698302 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 26 20:33:48 compute-0 sudo[162201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdxizsqbmukcalunrtdgoqxfmmecsqwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138028.6649275-42-190041520259342/AnsiballZ_setup.py'
Feb 26 20:33:48 compute-0 sudo[162201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:49 compute-0 python3.9[162204]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 26 20:33:49 compute-0 sudo[162201]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:49 compute-0 sudo[162286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxceinbutpqszreonxavtbyofrbkkjnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138028.6649275-42-190041520259342/AnsiballZ_dnf.py'
Feb 26 20:33:49 compute-0 sudo[162286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:50 compute-0 python3.9[162289]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:33:55 compute-0 sudo[162286]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:55 compute-0 sudo[162456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twxmyeuxzkzsadrbbmvtyvrzglamanwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138035.5715573-54-16529585228816/AnsiballZ_stat.py'
Feb 26 20:33:55 compute-0 sudo[162456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:56 compute-0 podman[162414]: 2026-02-26 20:33:56.008812413 +0000 UTC m=+0.096099728 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 26 20:33:56 compute-0 python3.9[162464]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:33:56 compute-0 sudo[162456]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:56 compute-0 sudo[162619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlmoqyystapewzkxnyyajuxtsbtpaahs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138036.5134094-64-208543271314741/AnsiballZ_command.py'
Feb 26 20:33:56 compute-0 sudo[162619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:57 compute-0 python3.9[162622]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:57 compute-0 sudo[162619]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:57 compute-0 sudo[162773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smawxyizmkhappfkxprkpwjjholeelhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138037.4112306-74-107985637595740/AnsiballZ_stat.py'
Feb 26 20:33:57 compute-0 sudo[162773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:57 compute-0 python3.9[162776]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:33:57 compute-0 sudo[162773]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:58 compute-0 sudo[162926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxskkugqpavbwsjzjthgushkqwboukoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138038.0264711-82-171810205858788/AnsiballZ_command.py'
Feb 26 20:33:58 compute-0 sudo[162926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:58 compute-0 python3.9[162929]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:33:58 compute-0 sudo[162926]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:58 compute-0 sudo[163080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdiuqgcjjrlmpstzddnihxcvhzccxirc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138038.6697967-90-57335222520426/AnsiballZ_stat.py'
Feb 26 20:33:58 compute-0 sudo[163080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:59 compute-0 python3.9[163083]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:33:59 compute-0 sudo[163080]: pam_unix(sudo:session): session closed for user root
Feb 26 20:33:59 compute-0 sudo[163204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdvwyaxcepnzfbgjyrzevaxhvhmynkvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138038.6697967-90-57335222520426/AnsiballZ_copy.py'
Feb 26 20:33:59 compute-0 sudo[163204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:33:59 compute-0 python3.9[163207]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138038.6697967-90-57335222520426/.source.iscsi _original_basename=.u7_24d83 follow=False checksum=092977d9557e918ab5f1c760bc16aae1162bc842 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:33:59 compute-0 sudo[163204]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:00 compute-0 sudo[163357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxtwepstymkeibtnpooeucgoplghbtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138040.0988467-105-163701045074028/AnsiballZ_file.py'
Feb 26 20:34:00 compute-0 sudo[163357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:00 compute-0 python3.9[163360]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:00 compute-0 sudo[163357]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:01 compute-0 sudo[163510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlwdkgzyohrrvjelfbiqpthnpurzjcrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138040.962892-113-260139603810677/AnsiballZ_lineinfile.py'
Feb 26 20:34:01 compute-0 sudo[163510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:01 compute-0 python3.9[163513]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:01 compute-0 sudo[163510]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:02 compute-0 sudo[163663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzccdjywwldzzlnvlcsxoicxpzjmjlvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138041.8490617-122-152907499045115/AnsiballZ_systemd_service.py'
Feb 26 20:34:02 compute-0 sudo[163663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:02 compute-0 python3.9[163666]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:02 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 26 20:34:02 compute-0 sudo[163663]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:03 compute-0 sudo[163820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksoqcuicajylynorysszcnkvtnbzewlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138042.8390589-130-185316959958205/AnsiballZ_systemd_service.py'
Feb 26 20:34:03 compute-0 sudo[163820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:03 compute-0 python3.9[163823]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:03 compute-0 systemd[1]: Reloading.
Feb 26 20:34:03 compute-0 systemd-sysv-generator[163857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:34:03 compute-0 systemd-rc-local-generator[163854]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:34:03 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 26 20:34:03 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 26 20:34:03 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 26 20:34:03 compute-0 systemd[1]: Started Open-iSCSI.
Feb 26 20:34:03 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 26 20:34:03 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 26 20:34:03 compute-0 sudo[163820]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:05 compute-0 python3.9[164030]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:34:05 compute-0 network[164047]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:34:05 compute-0 network[164048]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:34:05 compute-0 network[164049]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:34:08 compute-0 sudo[164319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqdecljplkbnidckzgbmmngcmgmbxjcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138047.7437098-153-56077005618998/AnsiballZ_dnf.py'
Feb 26 20:34:08 compute-0 sudo[164319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:08 compute-0 python3.9[164322]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:34:10 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:34:10 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:34:10 compute-0 systemd[1]: Reloading.
Feb 26 20:34:10 compute-0 systemd-rc-local-generator[164360]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:34:10 compute-0 systemd-sysv-generator[164370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:34:10 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:34:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:34:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:34:11 compute-0 systemd[1]: run-ra913dc1c026343b8be0a86af535756c0.service: Deactivated successfully.
Feb 26 20:34:11 compute-0 sudo[164319]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:11 compute-0 sudo[164651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgewwjttyrpvqqdgnzvaugdphvbperjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138051.6266332-162-225118574511247/AnsiballZ_file.py'
Feb 26 20:34:11 compute-0 sudo[164651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:12 compute-0 python3.9[164654]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 26 20:34:12 compute-0 sudo[164651]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:12 compute-0 sudo[164804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-setrvlcdxywwljecnioufgmeqrcykgjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138052.3269825-170-105529550044769/AnsiballZ_modprobe.py'
Feb 26 20:34:12 compute-0 sudo[164804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:13 compute-0 python3.9[164807]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 26 20:34:13 compute-0 sudo[164804]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:13 compute-0 sudo[164961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tovxegtcntvzokituosxuyqbhaiiiazy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138053.193301-178-187711350108258/AnsiballZ_stat.py'
Feb 26 20:34:13 compute-0 sudo[164961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:13 compute-0 python3.9[164964]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:34:13 compute-0 sudo[164961]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:13 compute-0 sudo[165085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evwtilesgejlmpsovqxchiznpkzayjru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138053.193301-178-187711350108258/AnsiballZ_copy.py'
Feb 26 20:34:13 compute-0 sudo[165085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:14 compute-0 python3.9[165088]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138053.193301-178-187711350108258/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:14 compute-0 sudo[165085]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:14 compute-0 sudo[165238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjbuobxdcfigmemdqgezpfpblwqtauxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138054.3850064-194-74479950402045/AnsiballZ_lineinfile.py'
Feb 26 20:34:14 compute-0 sudo[165238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:15 compute-0 python3.9[165241]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:15 compute-0 sudo[165238]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:15 compute-0 sudo[165391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yursvzqbvidtsebyjbnwwskulzhctmte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138055.1858554-202-190809767288543/AnsiballZ_systemd.py'
Feb 26 20:34:15 compute-0 sudo[165391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:16 compute-0 python3.9[165394]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:34:16 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 26 20:34:16 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 26 20:34:16 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 26 20:34:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 26 20:34:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 26 20:34:16 compute-0 sudo[165391]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:16 compute-0 sudo[165548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okjghgzflvxtwqltvmrnohpqfoiauzwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138056.2943912-210-69070760445006/AnsiballZ_command.py'
Feb 26 20:34:16 compute-0 sudo[165548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:16 compute-0 python3.9[165551]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:34:16 compute-0 sudo[165548]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:17 compute-0 sudo[165702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzxlwhrvecipenqhckcbwboyurjfsdpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138057.0643437-220-157791168604489/AnsiballZ_stat.py'
Feb 26 20:34:17 compute-0 sudo[165702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:17 compute-0 python3.9[165705]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:34:17 compute-0 sudo[165702]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:17 compute-0 sudo[165866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrsyvikkqqrnqsqjxvfxmzbruulbiawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138057.6208959-229-105703331317127/AnsiballZ_stat.py'
Feb 26 20:34:17 compute-0 sudo[165866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:17 compute-0 podman[165829]: 2026-02-26 20:34:17.870844077 +0000 UTC m=+0.054661729 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 26 20:34:18 compute-0 python3.9[165877]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:34:18 compute-0 sudo[165866]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:18 compute-0 sudo[165999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vypcxbgmogktpquedivcunultrptybwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138057.6208959-229-105703331317127/AnsiballZ_copy.py'
Feb 26 20:34:18 compute-0 sudo[165999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:18 compute-0 python3.9[166002]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138057.6208959-229-105703331317127/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:18 compute-0 sudo[165999]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:19 compute-0 sudo[166152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyubxivxokamsrqcmendkkmirlyculut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138058.8199172-244-216188627794481/AnsiballZ_command.py'
Feb 26 20:34:19 compute-0 sudo[166152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:19 compute-0 python3.9[166155]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:34:19 compute-0 sudo[166152]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:19 compute-0 sudo[166306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjvgnhtderxyxgfbfchjvahguziigrlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138059.6366396-252-114793317817866/AnsiballZ_lineinfile.py'
Feb 26 20:34:19 compute-0 sudo[166306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:20 compute-0 python3.9[166309]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:20 compute-0 sudo[166306]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:20 compute-0 sudo[166459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihzdhofoajsfgbjdstbtipliwsueqfxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138060.2007372-260-139640732073326/AnsiballZ_replace.py'
Feb 26 20:34:20 compute-0 sudo[166459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:20 compute-0 python3.9[166462]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:20 compute-0 sudo[166459]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:21 compute-0 sudo[166612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aipmledcyaodmewiqniyctarzoayqjyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138061.0520515-268-85780898994042/AnsiballZ_replace.py'
Feb 26 20:34:21 compute-0 sudo[166612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:21 compute-0 python3.9[166615]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:21 compute-0 sudo[166612]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:21 compute-0 sudo[166765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ettswsrljnmcinmnkofquffnqwusghtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138061.659723-277-229512737511204/AnsiballZ_lineinfile.py'
Feb 26 20:34:21 compute-0 sudo[166765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:22 compute-0 python3.9[166768]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:22 compute-0 sudo[166765]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:22 compute-0 sudo[166918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inibxzzgpwdlvlbuibuxkqzkafejefwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138062.2707958-277-177507816009748/AnsiballZ_lineinfile.py'
Feb 26 20:34:22 compute-0 sudo[166918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:22 compute-0 python3.9[166921]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:22 compute-0 sudo[166918]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:23 compute-0 sudo[167071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hihuzxtdrflmctkcylethldkwjmpyxwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138063.0476396-277-41222116431850/AnsiballZ_lineinfile.py'
Feb 26 20:34:23 compute-0 sudo[167071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:23 compute-0 python3.9[167074]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:23 compute-0 sudo[167071]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:23 compute-0 sudo[167224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgwdzrltpmksmdwxvyamaoggkjbglhuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138063.6175976-277-104320010140679/AnsiballZ_lineinfile.py'
Feb 26 20:34:23 compute-0 sudo[167224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:24 compute-0 python3.9[167227]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:24 compute-0 sudo[167224]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:24 compute-0 sudo[167377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wryeuglancpltvbrjiijkvhivlfqdqaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138064.218989-306-72468093461789/AnsiballZ_stat.py'
Feb 26 20:34:24 compute-0 sudo[167377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:24 compute-0 python3.9[167380]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:34:24 compute-0 sudo[167377]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:25 compute-0 sudo[167532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfeiojtfovugdmcgvtxahcejojdlxlrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138064.8373928-314-6683006524370/AnsiballZ_command.py'
Feb 26 20:34:25 compute-0 sudo[167532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:25 compute-0 python3.9[167535]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:34:25 compute-0 sudo[167532]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:25 compute-0 sudo[167686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seyzxrghijdqhxgwneidsrdwerfnldud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138065.590115-323-11961605323957/AnsiballZ_systemd_service.py'
Feb 26 20:34:25 compute-0 sudo[167686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:26 compute-0 python3.9[167689]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:26 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 26 20:34:26 compute-0 sudo[167686]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:26 compute-0 podman[167691]: 2026-02-26 20:34:26.239818042 +0000 UTC m=+0.075174732 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:34:26 compute-0 sudo[167870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjloqgajoyoxozvwmrzajevfxcfhknnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138066.3502746-331-181208426454616/AnsiballZ_systemd_service.py'
Feb 26 20:34:26 compute-0 sudo[167870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:26 compute-0 python3.9[167873]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:26 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 26 20:34:26 compute-0 udevadm[167878]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 26 20:34:26 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 26 20:34:26 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 26 20:34:26 compute-0 multipathd[167882]: --------start up--------
Feb 26 20:34:26 compute-0 multipathd[167882]: read /etc/multipath.conf
Feb 26 20:34:26 compute-0 multipathd[167882]: path checkers start up
Feb 26 20:34:26 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 26 20:34:27 compute-0 sudo[167870]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:27 compute-0 sudo[168040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gergxbtkmhppbafoueimhyswbgyyjqoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138067.3646998-343-139091523971973/AnsiballZ_file.py'
Feb 26 20:34:27 compute-0 sudo[168040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:27 compute-0 python3.9[168043]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 26 20:34:27 compute-0 sudo[168040]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:28 compute-0 sudo[168193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feuslxmudyvxumjykgoqsuygfwqjzyqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138067.9245973-351-55892006279170/AnsiballZ_modprobe.py'
Feb 26 20:34:28 compute-0 sudo[168193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:28 compute-0 python3.9[168196]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 26 20:34:28 compute-0 kernel: Key type psk registered
Feb 26 20:34:28 compute-0 sudo[168193]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:28 compute-0 sudo[168356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woargcuxscximsznxmppjklgwnpaampn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138068.5009809-359-121747681019283/AnsiballZ_stat.py'
Feb 26 20:34:28 compute-0 sudo[168356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:29 compute-0 python3.9[168359]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:34:29 compute-0 sudo[168356]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:29 compute-0 sudo[168480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebdscxotiobduecgfssqlagqasisackr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138068.5009809-359-121747681019283/AnsiballZ_copy.py'
Feb 26 20:34:29 compute-0 sudo[168480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:29 compute-0 python3.9[168483]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138068.5009809-359-121747681019283/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:29 compute-0 sudo[168480]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:29 compute-0 sudo[168633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnyhiigiqocmpjkkkiwcoklvuummyoco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138069.7067432-375-160087358183344/AnsiballZ_lineinfile.py'
Feb 26 20:34:29 compute-0 sudo[168633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:30 compute-0 python3.9[168636]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:30 compute-0 sudo[168633]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:30 compute-0 sudo[168786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kokkiytrsxqxfegplvylrkrtifeavjfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138070.260513-383-221972693640423/AnsiballZ_systemd.py'
Feb 26 20:34:30 compute-0 sudo[168786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:30 compute-0 python3.9[168789]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:34:30 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 26 20:34:30 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 26 20:34:30 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 26 20:34:30 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 26 20:34:30 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 26 20:34:30 compute-0 sudo[168786]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:31 compute-0 sudo[168943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxiqkuyyabkkdkbihoegwnuuxswuexuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138071.0399451-391-82418886471770/AnsiballZ_dnf.py'
Feb 26 20:34:31 compute-0 sudo[168943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:31 compute-0 python3.9[168946]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 26 20:34:33 compute-0 systemd[1]: Reloading.
Feb 26 20:34:33 compute-0 systemd-rc-local-generator[168977]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:34:33 compute-0 systemd-sysv-generator[168982]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:34:33 compute-0 systemd[1]: Reloading.
Feb 26 20:34:33 compute-0 systemd-rc-local-generator[169019]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:34:34 compute-0 systemd-sysv-generator[169024]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:34:34 compute-0 systemd-logind[825]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 26 20:34:34 compute-0 systemd-logind[825]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 26 20:34:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 26 20:34:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 26 20:34:35 compute-0 systemd[1]: Reloading.
Feb 26 20:34:35 compute-0 systemd-sysv-generator[169130]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:34:35 compute-0 systemd-rc-local-generator[169125]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:34:35 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 26 20:34:35 compute-0 sudo[168943]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:36 compute-0 sudo[170439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odzuyffwcqmxhfvhjglrrznzccoaklyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138075.8364677-399-29256558359527/AnsiballZ_systemd_service.py'
Feb 26 20:34:36 compute-0 sudo[170439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:36 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 26 20:34:36 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 26 20:34:36 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.242s CPU time.
Feb 26 20:34:36 compute-0 systemd[1]: run-rb88a2519dfe64062bffe6cfd4f969b15.service: Deactivated successfully.
Feb 26 20:34:36 compute-0 python3.9[170442]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:34:36 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 26 20:34:36 compute-0 iscsid[163870]: iscsid shutting down.
Feb 26 20:34:36 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 26 20:34:36 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 26 20:34:36 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 26 20:34:36 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 26 20:34:36 compute-0 systemd[1]: Started Open-iSCSI.
Feb 26 20:34:36 compute-0 sudo[170439]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:36 compute-0 sudo[170597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxjgyfkgqepkqbaemztwmwzdiowlfkgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138076.6696982-407-275513707481302/AnsiballZ_systemd_service.py'
Feb 26 20:34:36 compute-0 sudo[170597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:37 compute-0 python3.9[170600]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:34:37 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 26 20:34:37 compute-0 multipathd[167882]: exit (signal)
Feb 26 20:34:37 compute-0 multipathd[167882]: --------shut down-------
Feb 26 20:34:37 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 26 20:34:37 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 26 20:34:37 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 26 20:34:37 compute-0 multipathd[170606]: --------start up--------
Feb 26 20:34:37 compute-0 multipathd[170606]: read /etc/multipath.conf
Feb 26 20:34:37 compute-0 multipathd[170606]: path checkers start up
Feb 26 20:34:37 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 26 20:34:37 compute-0 sudo[170597]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:37 compute-0 python3.9[170764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:34:38 compute-0 sudo[170918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybewhgmhvijtucccuviynixnqrqxlzfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138078.380835-425-14372538624392/AnsiballZ_file.py'
Feb 26 20:34:38 compute-0 sudo[170918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:38 compute-0 python3.9[170921]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:38 compute-0 sudo[170918]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:39 compute-0 sudo[171071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huwevxphemgqanayicmgvbunntjcpsfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138079.1125348-436-32025469119119/AnsiballZ_systemd_service.py'
Feb 26 20:34:39 compute-0 sudo[171071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:39 compute-0 python3.9[171074]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:34:39 compute-0 systemd[1]: Reloading.
Feb 26 20:34:39 compute-0 systemd-sysv-generator[171100]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:34:39 compute-0 systemd-rc-local-generator[171094]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:34:39 compute-0 sudo[171071]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:40 compute-0 python3.9[171265]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:34:40 compute-0 network[171282]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:34:40 compute-0 network[171283]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:34:40 compute-0 network[171284]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:34:43 compute-0 sudo[171555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pecblazuwdozkzftttiwrybuigezfgod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138083.5413456-455-244702309805595/AnsiballZ_systemd_service.py'
Feb 26 20:34:43 compute-0 sudo[171555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:44 compute-0 python3.9[171558]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:44 compute-0 sudo[171555]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:44 compute-0 sudo[171709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erhelkukpcazpakpetdfgkgffueicpkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138084.3376102-455-173386916051252/AnsiballZ_systemd_service.py'
Feb 26 20:34:44 compute-0 sudo[171709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:44 compute-0 python3.9[171712]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:44 compute-0 sudo[171709]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:45 compute-0 sudo[171863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cagzmsjgigzrntllndbyjpmdbcxvkvdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138084.976621-455-170458693296941/AnsiballZ_systemd_service.py'
Feb 26 20:34:45 compute-0 sudo[171863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:45 compute-0 python3.9[171866]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:45 compute-0 sudo[171863]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:45 compute-0 sudo[172017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-archqfabwbwkdlmtdlxidadxgvsajvub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138085.6237233-455-222354893862771/AnsiballZ_systemd_service.py'
Feb 26 20:34:45 compute-0 sudo[172017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:46 compute-0 python3.9[172020]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:46 compute-0 sudo[172017]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:34:46.496 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:34:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:34:46.498 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:34:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:34:46.498 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:34:46 compute-0 sudo[172171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxfvpbznyydknjwulnojcryrnfuucdbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138086.4470701-455-147432149729295/AnsiballZ_systemd_service.py'
Feb 26 20:34:46 compute-0 sudo[172171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:47 compute-0 python3.9[172174]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:47 compute-0 sudo[172171]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:47 compute-0 sudo[172325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbrvikhzgsdmiusvbtosdctaboeohcpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138087.1362374-455-139202416214970/AnsiballZ_systemd_service.py'
Feb 26 20:34:47 compute-0 sudo[172325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:47 compute-0 python3.9[172328]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:47 compute-0 sudo[172325]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:48 compute-0 sudo[172485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irbnotbqgruqhfxtvabiboqcgihxdpgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138087.8955746-455-6009678614904/AnsiballZ_systemd_service.py'
Feb 26 20:34:48 compute-0 sudo[172485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:48 compute-0 podman[172453]: 2026-02-26 20:34:48.206143005 +0000 UTC m=+0.067711357 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 26 20:34:48 compute-0 python3.9[172493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:48 compute-0 sudo[172485]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:48 compute-0 sudo[172654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynughqndfpczytqdveddyizkxjqrbaac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138088.6246295-455-194605026387403/AnsiballZ_systemd_service.py'
Feb 26 20:34:48 compute-0 sudo[172654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:49 compute-0 python3.9[172657]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:34:49 compute-0 sudo[172654]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:49 compute-0 sudo[172808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvltxxhtyimyzcdgyajdzyehjpvciped ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138089.5177135-514-71045604831276/AnsiballZ_file.py'
Feb 26 20:34:49 compute-0 sudo[172808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:49 compute-0 python3.9[172811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:49 compute-0 sudo[172808]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:50 compute-0 sudo[172961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siqbcvxqtbzppshrktacnwosqnkfnfbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138090.1172004-514-193939658032065/AnsiballZ_file.py'
Feb 26 20:34:50 compute-0 sudo[172961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:50 compute-0 python3.9[172964]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:50 compute-0 sudo[172961]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:51 compute-0 sudo[173114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyeitzxwovbnwkppktokrmyejhbwsjwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138090.749514-514-264892207036428/AnsiballZ_file.py'
Feb 26 20:34:51 compute-0 sudo[173114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:51 compute-0 python3.9[173117]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:51 compute-0 sudo[173114]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:51 compute-0 sudo[173267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eufupainkuyxyhqoyqimzjxhxunlmamu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138091.4355206-514-234387347821282/AnsiballZ_file.py'
Feb 26 20:34:51 compute-0 sudo[173267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:51 compute-0 python3.9[173270]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:51 compute-0 sudo[173267]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:52 compute-0 sudo[173420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntejfkkrwbonikvzqlxhcofcurnlawsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138092.0493388-514-261241194651403/AnsiballZ_file.py'
Feb 26 20:34:52 compute-0 sudo[173420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:52 compute-0 python3.9[173423]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:52 compute-0 sudo[173420]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:52 compute-0 sudo[173573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-creewjmwsdxdpqnaiiiwaaqktptqrxgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138092.562672-514-65274088365882/AnsiballZ_file.py'
Feb 26 20:34:52 compute-0 sudo[173573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:53 compute-0 python3.9[173576]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:53 compute-0 sudo[173573]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:53 compute-0 sudo[173726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frchmfuptrdnbspvpegvdcjudapryfth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138093.1496027-514-269557763295161/AnsiballZ_file.py'
Feb 26 20:34:53 compute-0 sudo[173726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:53 compute-0 python3.9[173729]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:53 compute-0 sudo[173726]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:53 compute-0 sudo[173879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iavqgmvgcthzkuysiikbvwudavsnfnjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138093.6270895-514-15479892141111/AnsiballZ_file.py'
Feb 26 20:34:53 compute-0 sudo[173879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:54 compute-0 python3.9[173882]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:54 compute-0 sudo[173879]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:54 compute-0 sudo[174032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgtmmetjebcoyhozabgcwsrrjvtfvwah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138094.244329-571-226449320719716/AnsiballZ_file.py'
Feb 26 20:34:54 compute-0 sudo[174032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:54 compute-0 python3.9[174035]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:54 compute-0 sudo[174032]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:55 compute-0 sudo[174185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeckbuqyoygattnqxyzsepxhytvljwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138094.8118887-571-198579945973193/AnsiballZ_file.py'
Feb 26 20:34:55 compute-0 sudo[174185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:55 compute-0 python3.9[174188]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:55 compute-0 sudo[174185]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:55 compute-0 sudo[174338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovdeywprymrgngfsnvpkvqxorrwvduyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138095.4833825-571-253953751883803/AnsiballZ_file.py'
Feb 26 20:34:55 compute-0 sudo[174338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:55 compute-0 python3.9[174341]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:55 compute-0 sudo[174338]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:56 compute-0 sudo[174491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lomrnrxxrsembskexemxbxvhikrpbkll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138095.989493-571-246005696960822/AnsiballZ_file.py'
Feb 26 20:34:56 compute-0 sudo[174491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:56 compute-0 python3.9[174494]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:56 compute-0 sudo[174491]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:56 compute-0 podman[174495]: 2026-02-26 20:34:56.512257005 +0000 UTC m=+0.076751913 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, tcib_managed=true)
Feb 26 20:34:56 compute-0 sudo[174670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muccvwxxfxhdpgtdttrrfhqbjsqbrjsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138096.5371811-571-144990010804800/AnsiballZ_file.py'
Feb 26 20:34:56 compute-0 sudo[174670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:56 compute-0 python3.9[174673]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:56 compute-0 sudo[174670]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:57 compute-0 sudo[174823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpawiddtafzubabhfblzjcaueemfzwtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138097.0470774-571-130021173212084/AnsiballZ_file.py'
Feb 26 20:34:57 compute-0 sudo[174823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:57 compute-0 python3.9[174826]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:57 compute-0 sudo[174823]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:57 compute-0 sudo[174976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doqjswdpgifsfoxjmxpintmgsafhvzgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138097.6284113-571-68419840989940/AnsiballZ_file.py'
Feb 26 20:34:57 compute-0 sudo[174976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:58 compute-0 python3.9[174979]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:58 compute-0 sudo[174976]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:58 compute-0 sudo[175129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iywlmnpymbeqdnltaakavrcdzblzruhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138098.1993034-571-82734015470633/AnsiballZ_file.py'
Feb 26 20:34:58 compute-0 sudo[175129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:58 compute-0 python3.9[175132]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:34:58 compute-0 sudo[175129]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:59 compute-0 sudo[175282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrcssavdergbzxjicvcguospoicjhbet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138098.9185665-629-275349689012926/AnsiballZ_command.py'
Feb 26 20:34:59 compute-0 sudo[175282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:34:59 compute-0 python3.9[175285]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:34:59 compute-0 sudo[175282]: pam_unix(sudo:session): session closed for user root
Feb 26 20:34:59 compute-0 python3.9[175437]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 26 20:35:00 compute-0 sudo[175587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqkavwcxpsumpghhjhinyrzgipzdgfas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138100.1475706-647-136779232437418/AnsiballZ_systemd_service.py'
Feb 26 20:35:00 compute-0 sudo[175587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:00 compute-0 python3.9[175590]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:35:00 compute-0 systemd[1]: Reloading.
Feb 26 20:35:00 compute-0 systemd-rc-local-generator[175609]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:35:00 compute-0 systemd-sysv-generator[175616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:35:01 compute-0 sudo[175587]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:01 compute-0 sudo[175784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjpuhvdoancvxpaurairliggmdepqbow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138101.14749-655-124137981711631/AnsiballZ_command.py'
Feb 26 20:35:01 compute-0 sudo[175784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:01 compute-0 python3.9[175787]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:01 compute-0 sudo[175784]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:01 compute-0 sudo[175938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzkogqtprajimsbuuveqtqmtetiilbab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138101.7004335-655-104829577077935/AnsiballZ_command.py'
Feb 26 20:35:01 compute-0 sudo[175938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:02 compute-0 python3.9[175941]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:02 compute-0 sudo[175938]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:02 compute-0 sudo[176092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjikzkdcusoiqqbllalemdzrmhawlbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138102.3770096-655-216226728426827/AnsiballZ_command.py'
Feb 26 20:35:02 compute-0 sudo[176092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:02 compute-0 python3.9[176095]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:02 compute-0 sudo[176092]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:03 compute-0 sudo[176246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojoxkhfemlmnqfaytqzhntppiasgcxfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138102.976869-655-261603740916473/AnsiballZ_command.py'
Feb 26 20:35:03 compute-0 sudo[176246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:03 compute-0 python3.9[176249]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:03 compute-0 sudo[176246]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:03 compute-0 sudo[176400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fveoljiyzohhzryeqgzklrvpirtlotdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138103.6016927-655-209979706737232/AnsiballZ_command.py'
Feb 26 20:35:03 compute-0 sudo[176400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:04 compute-0 python3.9[176403]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:04 compute-0 sudo[176400]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:04 compute-0 sudo[176554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsrhrkiivpbdtukdcxkuibrmnxcclshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138104.3133428-655-216113361023394/AnsiballZ_command.py'
Feb 26 20:35:04 compute-0 sudo[176554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:04 compute-0 python3.9[176557]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:04 compute-0 sudo[176554]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:05 compute-0 sudo[176708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubheutouywiaqgytywrtoxnwtcwbpbli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138104.9565148-655-28129663315352/AnsiballZ_command.py'
Feb 26 20:35:05 compute-0 sudo[176708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:05 compute-0 python3.9[176711]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:05 compute-0 sudo[176708]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:05 compute-0 sudo[176862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdqrgagvprumryqziyzgdnflwieeyxvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138105.513022-655-219238153725994/AnsiballZ_command.py'
Feb 26 20:35:05 compute-0 sudo[176862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:05 compute-0 python3.9[176865]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:35:05 compute-0 sudo[176862]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:07 compute-0 sudo[177016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgvualbncxxpslxkfwcyxqwvqiwbrfxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138106.9587128-734-88881889830009/AnsiballZ_file.py'
Feb 26 20:35:07 compute-0 sudo[177016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:07 compute-0 python3.9[177019]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:07 compute-0 sudo[177016]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:07 compute-0 sudo[177169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkbujaigveuxbfbmbmvsadluazxklyzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138107.4624119-734-46310820473536/AnsiballZ_file.py'
Feb 26 20:35:07 compute-0 sudo[177169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:07 compute-0 python3.9[177172]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:07 compute-0 sudo[177169]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:08 compute-0 sudo[177322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnepihjixzgmfxteogoyykfnqqrhxrfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138108.0481832-749-196833482456552/AnsiballZ_file.py'
Feb 26 20:35:08 compute-0 sudo[177322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:08 compute-0 python3.9[177325]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:08 compute-0 sudo[177322]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:08 compute-0 sudo[177475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qusgrdepytqzjhqrcamhryxeivxpnuds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138108.6577563-749-205695453958393/AnsiballZ_file.py'
Feb 26 20:35:08 compute-0 sudo[177475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:09 compute-0 python3.9[177478]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:09 compute-0 sudo[177475]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:09 compute-0 sudo[177628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvvynuensanklkdnavalhmbdapigwor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138109.2063334-749-184942413130454/AnsiballZ_file.py'
Feb 26 20:35:09 compute-0 sudo[177628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:09 compute-0 python3.9[177631]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:09 compute-0 sudo[177628]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:09 compute-0 sudo[177781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odoholxxbxibbntbuadivdbwcsuazslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138109.7244105-749-101488525168528/AnsiballZ_file.py'
Feb 26 20:35:09 compute-0 sudo[177781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:10 compute-0 python3.9[177784]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:10 compute-0 sudo[177781]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:10 compute-0 sudo[177934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkufqhbeakoodeunihpvcjtnyaoqsvwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138110.193843-749-97686375955631/AnsiballZ_file.py'
Feb 26 20:35:10 compute-0 sudo[177934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:10 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 26 20:35:10 compute-0 python3.9[177937]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:10 compute-0 sudo[177934]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:10 compute-0 sudo[178088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exntknolkqttleeueikqsfalwmxmpgue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138110.7032616-749-88432904044913/AnsiballZ_file.py'
Feb 26 20:35:10 compute-0 sudo[178088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:11 compute-0 python3.9[178091]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:11 compute-0 sudo[178088]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:11 compute-0 sudo[178241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqgisxptxeqhdigsmhzxbzhmpxctpiga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138111.2396927-749-50926912962699/AnsiballZ_file.py'
Feb 26 20:35:11 compute-0 sudo[178241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:11 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 26 20:35:11 compute-0 python3.9[178244]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:11 compute-0 sudo[178241]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:12 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 26 20:35:13 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 26 20:35:17 compute-0 sudo[178397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgigvrbmtudjofegwchcslgoxsrxgwzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138116.7964292-958-63638009493714/AnsiballZ_getent.py'
Feb 26 20:35:17 compute-0 sudo[178397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:17 compute-0 python3.9[178400]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 26 20:35:17 compute-0 sudo[178397]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:18 compute-0 sudo[178551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeupsdmjbxnlndygxcweziguvtyihkrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138117.633727-966-198691917951792/AnsiballZ_group.py'
Feb 26 20:35:18 compute-0 sudo[178551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:18 compute-0 python3.9[178554]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 26 20:35:18 compute-0 groupadd[178556]: group added to /etc/group: name=nova, GID=42436
Feb 26 20:35:18 compute-0 groupadd[178556]: group added to /etc/gshadow: name=nova
Feb 26 20:35:18 compute-0 groupadd[178556]: new group: name=nova, GID=42436
Feb 26 20:35:18 compute-0 sudo[178551]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:18 compute-0 podman[178555]: 2026-02-26 20:35:18.414495937 +0000 UTC m=+0.081013934 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 26 20:35:19 compute-0 sudo[178729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfgrteluoedlwygypkmxpvuwvmotbjbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138118.5764124-974-226521799826418/AnsiballZ_user.py'
Feb 26 20:35:19 compute-0 sudo[178729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:19 compute-0 python3.9[178732]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 26 20:35:19 compute-0 useradd[178734]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 26 20:35:19 compute-0 useradd[178734]: add 'nova' to group 'libvirt'
Feb 26 20:35:19 compute-0 useradd[178734]: add 'nova' to shadow group 'libvirt'
Feb 26 20:35:19 compute-0 sudo[178729]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:20 compute-0 sshd-session[178765]: Accepted publickey for zuul from 192.168.122.30 port 32810 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:35:20 compute-0 systemd-logind[825]: New session 24 of user zuul.
Feb 26 20:35:20 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 26 20:35:20 compute-0 sshd-session[178765]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:35:20 compute-0 sshd-session[178768]: Received disconnect from 192.168.122.30 port 32810:11: disconnected by user
Feb 26 20:35:20 compute-0 sshd-session[178768]: Disconnected from user zuul 192.168.122.30 port 32810
Feb 26 20:35:20 compute-0 sshd-session[178765]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:35:20 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 26 20:35:20 compute-0 systemd-logind[825]: Session 24 logged out. Waiting for processes to exit.
Feb 26 20:35:20 compute-0 systemd-logind[825]: Removed session 24.
Feb 26 20:35:20 compute-0 python3.9[178918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:21 compute-0 python3.9[178994]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:21 compute-0 python3.9[179144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:22 compute-0 python3.9[179265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138121.3473182-999-71687381231878/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:22 compute-0 python3.9[179415]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:23 compute-0 python3.9[179536]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138122.5778604-999-120251984776835/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:23 compute-0 python3.9[179686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:24 compute-0 python3.9[179807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138123.4656851-999-73805297458468/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:24 compute-0 python3.9[179957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:25 compute-0 python3.9[180078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138124.468353-1053-62279243405711/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:25 compute-0 sudo[180228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdsyhvyiwmfgjmvfacohcgyuuveehrqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138125.6493828-1068-206843932084539/AnsiballZ_file.py'
Feb 26 20:35:25 compute-0 sudo[180228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:26 compute-0 python3.9[180231]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:26 compute-0 sudo[180228]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:26 compute-0 sudo[180381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bimoisvrlwazaodnhadjkrpaprfblogd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138126.1991959-1076-111090052356830/AnsiballZ_copy.py'
Feb 26 20:35:26 compute-0 sudo[180381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:26 compute-0 python3.9[180384]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:26 compute-0 sudo[180381]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:26 compute-0 podman[180385]: 2026-02-26 20:35:26.721589262 +0000 UTC m=+0.093097009 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 26 20:35:27 compute-0 sudo[180560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgfclpqhfkmynfsmygxrpnxpjrxqvmei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138126.778563-1084-138027850996788/AnsiballZ_stat.py'
Feb 26 20:35:27 compute-0 sudo[180560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:27 compute-0 python3.9[180563]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:27 compute-0 sudo[180560]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:27 compute-0 sudo[180713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjhbfzrwcoysmkdzdllhyqzvgyamshau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138127.3468647-1092-276785622902539/AnsiballZ_stat.py'
Feb 26 20:35:27 compute-0 sudo[180713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:27 compute-0 python3.9[180716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:27 compute-0 sudo[180713]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:28 compute-0 sudo[180837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbjmnjiqrsapahifdrypvwrkqhmhwjwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138127.3468647-1092-276785622902539/AnsiballZ_copy.py'
Feb 26 20:35:28 compute-0 sudo[180837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:28 compute-0 python3.9[180840]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1772138127.3468647-1092-276785622902539/.source _original_basename=.2lkstmmm follow=False checksum=6e33eb8c0e674348c8c2f5faae11b41cfaf26c31 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 26 20:35:28 compute-0 sudo[180837]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:28 compute-0 python3.9[180992]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:29 compute-0 sudo[181144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpqeelgtkhxidqmeritbwlijxivbamvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138129.2745044-1120-223021912657662/AnsiballZ_file.py'
Feb 26 20:35:29 compute-0 sudo[181144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:29 compute-0 python3.9[181147]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:29 compute-0 sudo[181144]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:30 compute-0 sudo[181297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgkpkzatuhqbqpkmfmxepyjeqvhobjdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138129.8217645-1128-180999362170214/AnsiballZ_file.py'
Feb 26 20:35:30 compute-0 sudo[181297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:30 compute-0 python3.9[181300]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:30 compute-0 sudo[181297]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:30 compute-0 python3.9[181450]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:32 compute-0 sudo[181871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhcogtbnejrtyjftlordcnjozvkhbdhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138132.1894083-1162-52789458417202/AnsiballZ_container_config_data.py'
Feb 26 20:35:32 compute-0 sudo[181871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:32 compute-0 python3.9[181874]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 26 20:35:32 compute-0 sudo[181871]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:33 compute-0 sudo[182024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zredauabiqqfmhjqixkedddzmlpffkii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138133.2302616-1173-134392316758530/AnsiballZ_container_config_hash.py'
Feb 26 20:35:33 compute-0 sudo[182024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:33 compute-0 python3.9[182027]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:35:33 compute-0 sudo[182024]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:35 compute-0 sudo[182177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmoeelxkjylyjtbhacfzqjgnhwhkwfas ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138134.0384312-1183-82233222557959/AnsiballZ_edpm_container_manage.py'
Feb 26 20:35:35 compute-0 sudo[182177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:35 compute-0 python3[182180]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:35:35 compute-0 podman[182213]: 2026-02-26 20:35:35.45614339 +0000 UTC m=+0.045882771 container create 1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, container_name=nova_compute_init)
Feb 26 20:35:35 compute-0 podman[182213]: 2026-02-26 20:35:35.428512586 +0000 UTC m=+0.018251967 image pull 7e637240710437807d86f704ec92f4417e40d6b1f76088848cab504c91655fe5 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 26 20:35:35 compute-0 python3[182180]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 26 20:35:35 compute-0 sudo[182177]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:35 compute-0 sudo[182401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixnqqvtyrchszjnsokcprjiofhitulcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138135.683958-1191-250269162618193/AnsiballZ_stat.py'
Feb 26 20:35:35 compute-0 sudo[182401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:36 compute-0 python3.9[182404]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:36 compute-0 sudo[182401]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:37 compute-0 python3.9[182556]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:35:37 compute-0 sudo[182706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjbatnwhmmitahzjplcwlmryqqrzfhlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138137.4700568-1218-248450524660526/AnsiballZ_stat.py'
Feb 26 20:35:37 compute-0 sudo[182706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:37 compute-0 python3.9[182709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:37 compute-0 sudo[182706]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:38 compute-0 sudo[182832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsvwlseeuapwcxgasksvhkhfbkjmtqqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138137.4700568-1218-248450524660526/AnsiballZ_copy.py'
Feb 26 20:35:38 compute-0 sudo[182832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:38 compute-0 python3.9[182835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138137.4700568-1218-248450524660526/.source.yaml _original_basename=.fjerxoie follow=False checksum=4586f6877742f46a2abaaf1274137efd9a3cb19b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:38 compute-0 sudo[182832]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:38 compute-0 sudo[182985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twttawaddwvpfuhhxxzaqsxwcvvtaxxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138138.715439-1235-123011980172899/AnsiballZ_file.py'
Feb 26 20:35:38 compute-0 sudo[182985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:39 compute-0 python3.9[182988]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:39 compute-0 sudo[182985]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:39 compute-0 sudo[183138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axcbliuxzwmvajnhzoppcbepqnzlehvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138139.2924902-1243-239226518607011/AnsiballZ_file.py'
Feb 26 20:35:39 compute-0 sudo[183138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:39 compute-0 python3.9[183141]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:35:39 compute-0 sudo[183138]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:40 compute-0 sudo[183291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbjusrrdixkpaohwgxlhbeqxwzaztalp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138139.8993156-1251-138533063396290/AnsiballZ_stat.py'
Feb 26 20:35:40 compute-0 sudo[183291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:40 compute-0 python3.9[183294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:40 compute-0 sudo[183291]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:40 compute-0 sudo[183415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqfupybtjehvxnyoynallbccsqerlrzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138139.8993156-1251-138533063396290/AnsiballZ_copy.py'
Feb 26 20:35:40 compute-0 sudo[183415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:40 compute-0 python3.9[183418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138139.8993156-1251-138533063396290/.source.json _original_basename=.uvuea2zv follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:40 compute-0 sudo[183415]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:41 compute-0 python3.9[183568]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:42 compute-0 sudo[183989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwxtsxgbdenbjjfxcajwtzicldlhqci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138142.464244-1291-106983663310585/AnsiballZ_container_config_data.py'
Feb 26 20:35:42 compute-0 sudo[183989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:42 compute-0 python3.9[183992]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 26 20:35:42 compute-0 sudo[183989]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:43 compute-0 sudo[184142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lljwoozidipzdqhonhvzokdguayestzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138143.2373755-1302-53859792577826/AnsiballZ_container_config_hash.py'
Feb 26 20:35:43 compute-0 sudo[184142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:43 compute-0 python3.9[184145]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:35:43 compute-0 sudo[184142]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:44 compute-0 sudo[184295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjkeebfjfgdgdlvyhrddmuvorjompqvd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138143.9889116-1312-232330892894507/AnsiballZ_edpm_container_manage.py'
Feb 26 20:35:44 compute-0 sudo[184295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:44 compute-0 python3[184298]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:35:44 compute-0 podman[184334]: 2026-02-26 20:35:44.678752059 +0000 UTC m=+0.059695861 container create d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, config_id=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 26 20:35:44 compute-0 podman[184334]: 2026-02-26 20:35:44.648439342 +0000 UTC m=+0.029383244 image pull 7e637240710437807d86f704ec92f4417e40d6b1f76088848cab504c91655fe5 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 26 20:35:44 compute-0 python3[184298]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 26 20:35:45 compute-0 sudo[184295]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:46 compute-0 sudo[184523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjfolwxnstgplmfvilggoxusxwxwzxhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138145.9402058-1320-183404710264134/AnsiballZ_stat.py'
Feb 26 20:35:46 compute-0 sudo[184523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:46 compute-0 python3.9[184526]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:46 compute-0 sudo[184523]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:35:46.497 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:35:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:35:46.498 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:35:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:35:46.498 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:35:46 compute-0 sudo[184678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svryezgggztyldwwnmsucodyvmonymgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138146.576864-1329-256894823975548/AnsiballZ_file.py'
Feb 26 20:35:46 compute-0 sudo[184678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:47 compute-0 python3.9[184681]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:47 compute-0 sudo[184678]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:47 compute-0 sudo[184755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-begmnpiipowxhfslwlfnpwekxamrfqze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138146.576864-1329-256894823975548/AnsiballZ_stat.py'
Feb 26 20:35:47 compute-0 sudo[184755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:47 compute-0 python3.9[184758]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:47 compute-0 sudo[184755]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:47 compute-0 sudo[184907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egpqwptptqqxmfgadvykehavvdkcpekd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138147.4298952-1329-216088920169605/AnsiballZ_copy.py'
Feb 26 20:35:47 compute-0 sudo[184907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:47 compute-0 python3.9[184910]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772138147.4298952-1329-216088920169605/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:47 compute-0 sudo[184907]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:48 compute-0 sudo[184984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbwxhvtywzzkxtsiplimccsvovafucca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138147.4298952-1329-216088920169605/AnsiballZ_systemd.py'
Feb 26 20:35:48 compute-0 sudo[184984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:48 compute-0 python3.9[184987]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:35:48 compute-0 systemd[1]: Reloading.
Feb 26 20:35:48 compute-0 podman[184988]: 2026-02-26 20:35:48.575668584 +0000 UTC m=+0.079867557 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223)
Feb 26 20:35:48 compute-0 systemd-rc-local-generator[185033]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:35:48 compute-0 systemd-sysv-generator[185036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:35:48 compute-0 sudo[184984]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:48 compute-0 sudo[185121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpztlfnhjgtqftkrojhdqggtjcjspzpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138147.4298952-1329-216088920169605/AnsiballZ_systemd.py'
Feb 26 20:35:48 compute-0 sudo[185121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:49 compute-0 python3.9[185124]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:35:49 compute-0 systemd[1]: Reloading.
Feb 26 20:35:49 compute-0 systemd-sysv-generator[185151]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:35:49 compute-0 systemd-rc-local-generator[185148]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:35:49 compute-0 systemd[1]: Starting nova_compute container...
Feb 26 20:35:49 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:35:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:49 compute-0 podman[185171]: 2026-02-26 20:35:49.730541824 +0000 UTC m=+0.122315656 container init d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223)
Feb 26 20:35:49 compute-0 podman[185171]: 2026-02-26 20:35:49.743277272 +0000 UTC m=+0.135051114 container start d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 26 20:35:49 compute-0 podman[185171]: nova_compute
Feb 26 20:35:49 compute-0 systemd[1]: Started nova_compute container.
Feb 26 20:35:49 compute-0 nova_compute[185186]: + sudo -E kolla_set_configs
Feb 26 20:35:49 compute-0 sudo[185121]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Validating config file
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying service configuration files
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Deleting /etc/ceph
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Creating directory /etc/ceph
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /etc/ceph
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Writing out command to execute
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:49 compute-0 nova_compute[185186]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 26 20:35:49 compute-0 nova_compute[185186]: ++ cat /run_command
Feb 26 20:35:49 compute-0 nova_compute[185186]: + CMD=nova-compute
Feb 26 20:35:49 compute-0 nova_compute[185186]: + ARGS=
Feb 26 20:35:49 compute-0 nova_compute[185186]: + sudo kolla_copy_cacerts
Feb 26 20:35:49 compute-0 nova_compute[185186]: + [[ ! -n '' ]]
Feb 26 20:35:49 compute-0 nova_compute[185186]: + . kolla_extend_start
Feb 26 20:35:49 compute-0 nova_compute[185186]: Running command: 'nova-compute'
Feb 26 20:35:49 compute-0 nova_compute[185186]: + echo 'Running command: '\''nova-compute'\'''
Feb 26 20:35:49 compute-0 nova_compute[185186]: + umask 0022
Feb 26 20:35:49 compute-0 nova_compute[185186]: + exec nova-compute
Feb 26 20:35:50 compute-0 python3.9[185347]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:35:51 compute-0 sudo[185498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dygpoqtfkkrslstwdzxmvsuduzemnhob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138150.9445286-1374-241414237647157/AnsiballZ_stat.py'
Feb 26 20:35:51 compute-0 sudo[185498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:51 compute-0 python3.9[185501]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:35:51 compute-0 sudo[185498]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.742 185190 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.743 185190 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.743 185190 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.743 185190 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 26 20:35:51 compute-0 sudo[185626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezvaglwsarmzgkvobmmnnhmskdwypdca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138150.9445286-1374-241414237647157/AnsiballZ_copy.py'
Feb 26 20:35:51 compute-0 sudo[185626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.925 185190 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.951 185190 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:35:51 compute-0 nova_compute[185186]: 2026-02-26 20:35:51.952 185190 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 26 20:35:52 compute-0 python3.9[185629]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138150.9445286-1374-241414237647157/.source.yaml _original_basename=.g7pd6ln9 follow=False checksum=53ae079634fcf765c977a1bdc654e2da49fbeef7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:35:52 compute-0 sudo[185626]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.511 185190 INFO nova.virt.driver [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.635 185190 INFO nova.compute.provider_config [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.648 185190 DEBUG oslo_concurrency.lockutils [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.649 185190 DEBUG oslo_concurrency.lockutils [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.649 185190 DEBUG oslo_concurrency.lockutils [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.650 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.650 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.650 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.650 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.650 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.650 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.651 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.652 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.652 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.652 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.652 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.652 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.652 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.653 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.654 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.654 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.654 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.654 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.654 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.654 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.655 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.656 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.656 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.656 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.656 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.656 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.657 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.657 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.657 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.657 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.657 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.657 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.658 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.658 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.658 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.658 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.658 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.659 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.660 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.661 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.662 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.663 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.664 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.664 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.664 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.664 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.664 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.664 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.665 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.665 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.665 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.665 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.665 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.666 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.666 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.666 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.667 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.667 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.667 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.667 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.667 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.667 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.668 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.669 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.669 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.669 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.669 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.669 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.669 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.670 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.671 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.672 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.673 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.673 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.673 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.673 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.673 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.673 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.674 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.675 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.676 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.677 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.678 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.678 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.678 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.678 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.678 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.678 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.679 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.679 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.679 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.679 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.679 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.679 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.680 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.681 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.681 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.681 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.681 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.681 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.681 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.682 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.682 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.682 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.682 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.682 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.682 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.683 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.684 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.685 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.686 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.686 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.686 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.686 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.686 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.686 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.687 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.688 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.689 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.690 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.691 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.692 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.693 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.693 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.693 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.693 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.693 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.693 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.694 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.694 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.694 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.694 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.694 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.695 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.696 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.696 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.696 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.696 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.696 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.696 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.697 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.698 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.699 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.700 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.701 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.702 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.703 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.704 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.704 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.704 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.704 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.704 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.705 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.706 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.707 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.708 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.709 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.709 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.709 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.709 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.709 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.709 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.710 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.710 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.710 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.710 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.710 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.710 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.711 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.711 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.711 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.711 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.711 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.712 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.712 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.712 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.712 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.712 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.713 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.713 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.713 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.713 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.713 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.713 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.714 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.714 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.714 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.714 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.714 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.715 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.715 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.715 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.715 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.715 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.715 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.716 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.716 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.716 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.716 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.716 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.716 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.717 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.717 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.717 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.717 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.717 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.718 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.718 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.718 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.718 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.718 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.719 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.719 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.719 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.719 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.719 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.719 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.720 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.720 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.720 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.720 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.721 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.721 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.721 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.721 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.721 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.722 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.722 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.722 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.722 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.723 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.723 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.723 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.723 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.723 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.724 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.724 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.724 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.724 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.724 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.725 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.725 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.725 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.725 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.725 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.726 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.726 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.726 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.726 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.726 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.727 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.727 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.727 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.727 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.727 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.727 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.728 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.728 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.728 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.728 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.728 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.729 185190 WARNING oslo_config.cfg [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 26 20:35:52 compute-0 nova_compute[185186]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 26 20:35:52 compute-0 nova_compute[185186]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 26 20:35:52 compute-0 nova_compute[185186]: and ``live_migration_inbound_addr`` respectively.
Feb 26 20:35:52 compute-0 nova_compute[185186]: ).  Its value may be silently ignored in the future.
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.729 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.729 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.729 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.729 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.730 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.730 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.730 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.730 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.730 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.730 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.731 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.731 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.731 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.731 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.732 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.732 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.732 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.732 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.732 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.733 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.733 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.733 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.733 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.733 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.733 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.734 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.734 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.734 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.734 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.734 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.735 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.736 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.737 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.737 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.737 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.737 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.737 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.737 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.738 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.739 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.740 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.741 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.742 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.742 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.742 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.742 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.742 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.742 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.743 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.744 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.744 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.744 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.744 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.744 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.744 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.745 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.746 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.746 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.746 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.746 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.746 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.746 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.747 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.748 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.749 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.750 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.750 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.750 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.750 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.750 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.750 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.751 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.751 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.751 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.751 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.751 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.752 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.753 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.754 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.754 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.754 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.754 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.754 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.755 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.756 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.757 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.758 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.759 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.759 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.759 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.759 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.759 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.759 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.760 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.760 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.760 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.760 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.760 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.760 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.761 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.762 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.762 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.762 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.762 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.762 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.762 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.763 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.764 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.765 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.766 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.766 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.766 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.766 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.766 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.766 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.767 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.768 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.769 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.770 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.770 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.770 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.770 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.770 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.770 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.771 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.771 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.771 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.771 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.771 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.771 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.772 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.772 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.772 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.772 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.772 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.773 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.774 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.775 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.776 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.777 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.778 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.778 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.778 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.778 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.778 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.778 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.779 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.780 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.780 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.780 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.780 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.780 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.780 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.781 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.782 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.783 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.784 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.785 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.786 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.786 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.786 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.786 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.786 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.786 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.787 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.788 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.789 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.790 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.791 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.792 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.793 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.793 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.793 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.793 185190 DEBUG oslo_service.service [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.794 185190 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.808 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.809 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.809 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.809 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 26 20:35:52 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 26 20:35:52 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 26 20:35:52 compute-0 python3.9[185781]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.888 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fefded55190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.890 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fefded55190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.891 185190 INFO nova.virt.libvirt.driver [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Connection event '1' reason 'None'
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.915 185190 WARNING nova.virt.libvirt.driver [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 26 20:35:52 compute-0 nova_compute[185186]: 2026-02-26 20:35:52.915 185190 DEBUG nova.virt.libvirt.volume.mount [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 26 20:35:53 compute-0 python3.9[185983]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:53 compute-0 nova_compute[185186]: 2026-02-26 20:35:53.817 185190 INFO nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Libvirt host capabilities <capabilities>
Feb 26 20:35:53 compute-0 nova_compute[185186]: 
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <host>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <uuid>35e489ed-3c64-48cc-802f-42161f451b28</uuid>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <cpu>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <arch>x86_64</arch>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model>EPYC-Rome-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <vendor>AMD</vendor>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <microcode version='16777317'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <signature family='23' model='49' stepping='0'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='x2apic'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='tsc-deadline'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='osxsave'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='hypervisor'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='tsc_adjust'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='spec-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='stibp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='arch-capabilities'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='cmp_legacy'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='topoext'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='virt-ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='lbrv'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='tsc-scale'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='vmcb-clean'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='pause-filter'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='pfthreshold'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='svme-addr-chk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='rdctl-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='skip-l1dfl-vmentry'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='mds-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature name='pschange-mc-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <pages unit='KiB' size='4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <pages unit='KiB' size='2048'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <pages unit='KiB' size='1048576'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </cpu>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <power_management>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <suspend_mem/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <suspend_disk/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <suspend_hybrid/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </power_management>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <iommu support='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <migration_features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <live/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <uri_transports>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <uri_transport>tcp</uri_transport>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <uri_transport>rdma</uri_transport>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </uri_transports>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </migration_features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <topology>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <cells num='1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <cell id='0'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           <memory unit='KiB'>7864276</memory>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           <pages unit='KiB' size='4'>1966069</pages>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           <pages unit='KiB' size='2048'>0</pages>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           <distances>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <sibling id='0' value='10'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           </distances>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           <cpus num='8'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:           </cpus>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         </cell>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </cells>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </topology>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <cache>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </cache>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <secmodel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model>selinux</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <doi>0</doi>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </secmodel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <secmodel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model>dac</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <doi>0</doi>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </secmodel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </host>
Feb 26 20:35:53 compute-0 nova_compute[185186]: 
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <guest>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <os_type>hvm</os_type>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <arch name='i686'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <wordsize>32</wordsize>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <domain type='qemu'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <domain type='kvm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </arch>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <pae/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <nonpae/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <acpi default='on' toggle='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <apic default='on' toggle='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <cpuselection/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <deviceboot/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <disksnapshot default='on' toggle='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <externalSnapshot/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </guest>
Feb 26 20:35:53 compute-0 nova_compute[185186]: 
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <guest>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <os_type>hvm</os_type>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <arch name='x86_64'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <wordsize>64</wordsize>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <domain type='qemu'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <domain type='kvm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </arch>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <acpi default='on' toggle='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <apic default='on' toggle='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <cpuselection/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <deviceboot/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <disksnapshot default='on' toggle='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <externalSnapshot/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </guest>
Feb 26 20:35:53 compute-0 nova_compute[185186]: 
Feb 26 20:35:53 compute-0 nova_compute[185186]: </capabilities>
Feb 26 20:35:53 compute-0 nova_compute[185186]: 
Feb 26 20:35:53 compute-0 nova_compute[185186]: 2026-02-26 20:35:53.826 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 26 20:35:53 compute-0 nova_compute[185186]: 2026-02-26 20:35:53.846 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 26 20:35:53 compute-0 nova_compute[185186]: <domainCapabilities>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <domain>kvm</domain>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <arch>i686</arch>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <vcpu max='4096'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <iothreads supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <os supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <enum name='firmware'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <loader supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>rom</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>pflash</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='readonly'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>yes</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='secure'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </loader>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </os>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <cpu>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='maximumMigratable'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <vendor>AMD</vendor>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='succor'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='custom' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cooperlake'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='KnightsMill'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='KnightsMill-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='athlon'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='athlon-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='core2duo'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='core2duo-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='coreduo'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='coreduo-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='n270'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='n270-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='phenom'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='phenom-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </cpu>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <memoryBacking supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <enum name='sourceType'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <value>file</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <value>anonymous</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <value>memfd</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </memoryBacking>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <devices>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <disk supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='diskDevice'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>disk</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>cdrom</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>floppy</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>lun</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>fdc</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>sata</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </disk>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <graphics supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>vnc</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>egl-headless</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </graphics>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <video supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='modelType'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>vga</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>cirrus</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>none</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>bochs</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>ramfb</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </video>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <hostdev supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='mode'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>subsystem</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='startupPolicy'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>mandatory</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>requisite</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>optional</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='subsysType'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>pci</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='capsType'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='pciBackend'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </hostdev>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <rng supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>random</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>egd</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </rng>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <filesystem supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='driverType'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>path</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>handle</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>virtiofs</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </filesystem>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <tpm supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>tpm-tis</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>tpm-crb</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>emulator</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>external</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='backendVersion'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>2.0</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </tpm>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <redirdev supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </redirdev>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <channel supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </channel>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <crypto supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='model'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>qemu</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </crypto>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <interface supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='backendType'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>passt</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </interface>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <panic supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>isa</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>hyperv</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </panic>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <console supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>null</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>vc</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>dev</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>file</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>pipe</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>stdio</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>udp</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>tcp</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>qemu-vdagent</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </console>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </devices>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <features>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <gic supported='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <vmcoreinfo supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <genid supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <backingStoreInput supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <backup supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <async-teardown supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <s390-pv supported='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <ps2 supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <tdx supported='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <sev supported='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <sgx supported='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <hyperv supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='features'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>relaxed</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>vapic</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>spinlocks</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>vpindex</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>runtime</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>synic</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>stimer</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>reset</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>vendor_id</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>frequencies</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>reenlightenment</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>tlbflush</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>ipi</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>avic</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>emsr_bitmap</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>xmm_input</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <defaults>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <spinlocks>4095</spinlocks>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <stimer_direct>on</stimer_direct>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </defaults>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </hyperv>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <launchSecurity supported='no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </features>
Feb 26 20:35:53 compute-0 nova_compute[185186]: </domainCapabilities>
Feb 26 20:35:53 compute-0 nova_compute[185186]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:35:53 compute-0 nova_compute[185186]: 2026-02-26 20:35:53.854 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 26 20:35:53 compute-0 nova_compute[185186]: <domainCapabilities>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <domain>kvm</domain>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <arch>i686</arch>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <vcpu max='240'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <iothreads supported='yes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <os supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <enum name='firmware'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <loader supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>rom</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>pflash</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='readonly'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>yes</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='secure'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </loader>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   </os>
Feb 26 20:35:53 compute-0 nova_compute[185186]:   <cpu>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <enum name='maximumMigratable'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <vendor>AMD</vendor>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='succor'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:53 compute-0 nova_compute[185186]:     <mode name='custom' supported='yes'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cooperlake'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Denverton-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='EPYC-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Haswell-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='KnightsMill'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='KnightsMill-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v1'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:35:53 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v2'>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:53 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='athlon'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='athlon-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='core2duo'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='core2duo-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='coreduo'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='coreduo-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='n270'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='n270-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='phenom'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='phenom-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </cpu>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <memoryBacking supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <enum name='sourceType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>file</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>anonymous</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>memfd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </memoryBacking>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <devices>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <disk supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='diskDevice'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>disk</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>cdrom</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>floppy</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>lun</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ide</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>fdc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>sata</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </disk>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <graphics supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vnc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>egl-headless</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </graphics>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <video supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='modelType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vga</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>cirrus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>none</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>bochs</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ramfb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </video>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <hostdev supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='mode'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>subsystem</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='startupPolicy'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>mandatory</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>requisite</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>optional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='subsysType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pci</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='capsType'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='pciBackend'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </hostdev>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <rng supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>random</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>egd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </rng>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <filesystem supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='driverType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>path</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>handle</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtiofs</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </filesystem>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <tpm supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tpm-tis</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tpm-crb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>emulator</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>external</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendVersion'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>2.0</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </tpm>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <redirdev supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </redirdev>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <channel supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </channel>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <crypto supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>qemu</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </crypto>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <interface supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>passt</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </interface>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <panic supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>isa</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>hyperv</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </panic>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <console supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>null</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dev</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>file</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pipe</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>stdio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>udp</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tcp</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>qemu-vdagent</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </console>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </devices>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <features>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <gic supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <vmcoreinfo supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <genid supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <backingStoreInput supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <backup supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <async-teardown supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <s390-pv supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <ps2 supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <tdx supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <sev supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <sgx supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <hyperv supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='features'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>relaxed</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vapic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>spinlocks</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vpindex</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>runtime</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>synic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>stimer</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>reset</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vendor_id</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>frequencies</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>reenlightenment</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tlbflush</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ipi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>avic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>emsr_bitmap</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>xmm_input</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <defaults>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <spinlocks>4095</spinlocks>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <stimer_direct>on</stimer_direct>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </defaults>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </hyperv>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <launchSecurity supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </features>
Feb 26 20:35:54 compute-0 nova_compute[185186]: </domainCapabilities>
Feb 26 20:35:54 compute-0 nova_compute[185186]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:53.926 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:53.931 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 26 20:35:54 compute-0 nova_compute[185186]: <domainCapabilities>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <domain>kvm</domain>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <arch>x86_64</arch>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <vcpu max='4096'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <iothreads supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <os supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <enum name='firmware'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>efi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <loader supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>rom</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pflash</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='readonly'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>yes</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='secure'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>yes</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </loader>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </os>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <cpu>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='maximumMigratable'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <vendor>AMD</vendor>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='succor'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='custom' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cooperlake'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='KnightsMill'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='KnightsMill-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='athlon'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='athlon-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='core2duo'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='core2duo-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='coreduo'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='coreduo-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='n270'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='n270-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='phenom'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='phenom-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </cpu>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <memoryBacking supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <enum name='sourceType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>file</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>anonymous</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>memfd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </memoryBacking>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <devices>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <disk supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='diskDevice'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>disk</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>cdrom</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>floppy</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>lun</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>fdc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>sata</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </disk>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <graphics supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vnc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>egl-headless</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </graphics>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <video supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='modelType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vga</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>cirrus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>none</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>bochs</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ramfb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </video>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <hostdev supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='mode'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>subsystem</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='startupPolicy'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>mandatory</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>requisite</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>optional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='subsysType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pci</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='capsType'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='pciBackend'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </hostdev>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <rng supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>random</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>egd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </rng>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <filesystem supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='driverType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>path</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>handle</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtiofs</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </filesystem>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <tpm supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tpm-tis</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tpm-crb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>emulator</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>external</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendVersion'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>2.0</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </tpm>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <redirdev supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </redirdev>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <channel supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </channel>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <crypto supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>qemu</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </crypto>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <interface supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>passt</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </interface>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <panic supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>isa</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>hyperv</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </panic>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <console supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>null</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dev</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>file</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pipe</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>stdio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>udp</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tcp</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>qemu-vdagent</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </console>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </devices>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <features>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <gic supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <vmcoreinfo supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <genid supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <backingStoreInput supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <backup supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <async-teardown supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <s390-pv supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <ps2 supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <tdx supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <sev supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <sgx supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <hyperv supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='features'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>relaxed</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vapic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>spinlocks</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vpindex</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>runtime</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>synic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>stimer</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>reset</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vendor_id</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>frequencies</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>reenlightenment</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tlbflush</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ipi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>avic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>emsr_bitmap</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>xmm_input</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <defaults>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <spinlocks>4095</spinlocks>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <stimer_direct>on</stimer_direct>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </defaults>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </hyperv>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <launchSecurity supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </features>
Feb 26 20:35:54 compute-0 nova_compute[185186]: </domainCapabilities>
Feb 26 20:35:54 compute-0 nova_compute[185186]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.021 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 26 20:35:54 compute-0 nova_compute[185186]: <domainCapabilities>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <domain>kvm</domain>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <arch>x86_64</arch>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <vcpu max='240'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <iothreads supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <os supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <enum name='firmware'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <loader supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>rom</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pflash</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='readonly'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>yes</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='secure'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>no</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </loader>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </os>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <cpu>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='maximumMigratable'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>on</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>off</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <vendor>AMD</vendor>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='succor'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <mode name='custom' supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ddpd-u'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sha512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm3'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sm4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cooperlake'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Denverton-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amd-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='auto-ibrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='perfmon-v2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbpb'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='stibp-always-on'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='EPYC-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-128'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-256'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx10-512'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='prefetchiti'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Haswell-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='IvyBridge-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='KnightsMill'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='KnightsMill-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512er'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512pf'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fma4'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tbm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xop'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='amx-tile'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-bf16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-fp16'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bitalg'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrc'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fzrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='la57'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='taa-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='SierraForest-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ifma'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cmpccxadd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fbsdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='fsrs'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ibrs-all'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='intel-psfd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='lam'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mcdt-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pbrsb-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='psdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='serialize'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vaes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='hle'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='rtm'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512bw'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512cd'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512dq'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512f'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='avx512vl'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='invpcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pcid'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='pku'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='mpx'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v2'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v3'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='core-capability'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='split-lock-detect'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='Snowridge-v4'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='cldemote'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='erms'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='gfni'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdir64b'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='movdiri'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='xsaves'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='athlon'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='athlon-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='core2duo'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='core2duo-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='coreduo'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='coreduo-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='n270'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='n270-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='ss'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='phenom'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <blockers model='phenom-v1'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnow'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <feature name='3dnowext'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </blockers>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </mode>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </cpu>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <memoryBacking supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <enum name='sourceType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>file</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>anonymous</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <value>memfd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </memoryBacking>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <devices>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <disk supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='diskDevice'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>disk</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>cdrom</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>floppy</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>lun</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ide</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>fdc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>sata</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </disk>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <graphics supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vnc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>egl-headless</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </graphics>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <video supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='modelType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vga</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>cirrus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>none</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>bochs</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ramfb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </video>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <hostdev supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='mode'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>subsystem</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='startupPolicy'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>mandatory</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>requisite</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>optional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='subsysType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pci</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>scsi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='capsType'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='pciBackend'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </hostdev>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <rng supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtio-non-transitional</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>random</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>egd</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </rng>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <filesystem supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='driverType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>path</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>handle</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>virtiofs</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </filesystem>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <tpm supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tpm-tis</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tpm-crb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>emulator</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>external</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendVersion'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>2.0</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </tpm>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <redirdev supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='bus'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>usb</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </redirdev>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <channel supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </channel>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <crypto supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>qemu</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendModel'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>builtin</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </crypto>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <interface supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='backendType'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>default</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>passt</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </interface>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <panic supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='model'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>isa</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>hyperv</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </panic>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <console supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='type'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>null</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vc</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pty</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dev</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>file</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>pipe</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>stdio</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>udp</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tcp</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>unix</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>qemu-vdagent</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>dbus</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </console>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </devices>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   <features>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <gic supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <vmcoreinfo supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <genid supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <backingStoreInput supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <backup supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <async-teardown supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <s390-pv supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <ps2 supported='yes'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <tdx supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <sev supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <sgx supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <hyperv supported='yes'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <enum name='features'>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>relaxed</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vapic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>spinlocks</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vpindex</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>runtime</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>synic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>stimer</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>reset</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>vendor_id</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>frequencies</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>reenlightenment</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>tlbflush</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>ipi</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>avic</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>emsr_bitmap</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <value>xmm_input</value>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </enum>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       <defaults>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <spinlocks>4095</spinlocks>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <stimer_direct>on</stimer_direct>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:35:54 compute-0 nova_compute[185186]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:35:54 compute-0 nova_compute[185186]:       </defaults>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     </hyperv>
Feb 26 20:35:54 compute-0 nova_compute[185186]:     <launchSecurity supported='no'/>
Feb 26 20:35:54 compute-0 nova_compute[185186]:   </features>
Feb 26 20:35:54 compute-0 nova_compute[185186]: </domainCapabilities>
Feb 26 20:35:54 compute-0 nova_compute[185186]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.078 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.078 185190 INFO nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Secure Boot support detected
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.081 185190 INFO nova.virt.libvirt.driver [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.090 185190 DEBUG nova.virt.libvirt.driver [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.130 185190 INFO nova.virt.node [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Determined node identity 895ba9a7-707f-4e79-9130-ec9b9afa47ee from /var/lib/nova/compute_id
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.152 185190 WARNING nova.compute.manager [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Compute nodes ['895ba9a7-707f-4e79-9130-ec9b9afa47ee'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.190 185190 INFO nova.compute.manager [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 26 20:35:54 compute-0 python3.9[186145]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.223 185190 WARNING nova.compute.manager [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.223 185190 DEBUG oslo_concurrency.lockutils [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.223 185190 DEBUG oslo_concurrency.lockutils [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.223 185190 DEBUG oslo_concurrency.lockutils [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.224 185190 DEBUG nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:35:54 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 26 20:35:54 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.546 185190 WARNING nova.virt.libvirt.driver [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.549 185190 DEBUG nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6077MB free_disk=72.97111511230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.549 185190 DEBUG oslo_concurrency.lockutils [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.549 185190 DEBUG oslo_concurrency.lockutils [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.575 185190 WARNING nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] No compute node record for compute-0.ctlplane.example.com:895ba9a7-707f-4e79-9130-ec9b9afa47ee: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 895ba9a7-707f-4e79-9130-ec9b9afa47ee could not be found.
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.594 185190 INFO nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 895ba9a7-707f-4e79-9130-ec9b9afa47ee
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.662 185190 DEBUG nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:35:54 compute-0 nova_compute[185186]: 2026-02-26 20:35:54.662 185190 DEBUG nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:35:55 compute-0 sudo[186318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdyjckpztsonzuupfucwwksxplxgrluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138154.4625008-1424-281388733984504/AnsiballZ_podman_container.py'
Feb 26 20:35:55 compute-0 sudo[186318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:55 compute-0 python3.9[186321]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 26 20:35:55 compute-0 sudo[186318]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:55 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.486 185190 INFO nova.scheduler.client.report [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] [req-3e3dfcef-dd04-41cd-8158-ecdafdfda1dc] Created resource provider record via placement API for resource provider with UUID 895ba9a7-707f-4e79-9130-ec9b9afa47ee and name compute-0.ctlplane.example.com.
Feb 26 20:35:55 compute-0 sudo[186494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikxntkmskprumxozzserjnmjwuxksqjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138155.534366-1432-119308326287405/AnsiballZ_systemd.py'
Feb 26 20:35:55 compute-0 sudo[186494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.841 185190 DEBUG nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 26 20:35:55 compute-0 nova_compute[185186]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.842 185190 INFO nova.virt.libvirt.host [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] kernel doesn't support AMD SEV
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.843 185190 DEBUG nova.compute.provider_tree [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.844 185190 DEBUG nova.virt.libvirt.driver [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.887 185190 DEBUG nova.scheduler.client.report [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Updated inventory for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.888 185190 DEBUG nova.compute.provider_tree [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Updating resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.888 185190 DEBUG nova.compute.provider_tree [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.966 185190 DEBUG nova.compute.provider_tree [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Updating resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.988 185190 DEBUG nova.compute.resource_tracker [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.988 185190 DEBUG oslo_concurrency.lockutils [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:35:55 compute-0 nova_compute[185186]: 2026-02-26 20:35:55.989 185190 DEBUG nova.service [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 26 20:35:56 compute-0 python3.9[186497]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 26 20:35:56 compute-0 nova_compute[185186]: 2026-02-26 20:35:56.080 185190 DEBUG nova.service [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 26 20:35:56 compute-0 nova_compute[185186]: 2026-02-26 20:35:56.080 185190 DEBUG nova.servicegroup.drivers.db [None req-c906f636-e74e-45dd-9ccd-e119a9f0ed09 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 26 20:35:56 compute-0 sshd-session[186498]: Connection closed by 124.163.255.210 port 27159
Feb 26 20:35:56 compute-0 systemd[1]: Stopping nova_compute container...
Feb 26 20:35:56 compute-0 nova_compute[185186]: 2026-02-26 20:35:56.223 185190 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 26 20:35:56 compute-0 nova_compute[185186]: 2026-02-26 20:35:56.225 185190 DEBUG oslo_concurrency.lockutils [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:35:56 compute-0 nova_compute[185186]: 2026-02-26 20:35:56.225 185190 DEBUG oslo_concurrency.lockutils [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:35:56 compute-0 nova_compute[185186]: 2026-02-26 20:35:56.226 185190 DEBUG oslo_concurrency.lockutils [None req-163974d4-bc64-4b2f-a09f-f553d3f5d7f9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:35:56 compute-0 systemd[1]: libpod-d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302.scope: Deactivated successfully.
Feb 26 20:35:56 compute-0 virtqemud[185803]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 26 20:35:56 compute-0 virtqemud[185803]: hostname: compute-0
Feb 26 20:35:56 compute-0 virtqemud[185803]: End of file while reading data: Input/output error
Feb 26 20:35:56 compute-0 systemd[1]: libpod-d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302.scope: Consumed 3.364s CPU time.
Feb 26 20:35:56 compute-0 podman[186502]: 2026-02-26 20:35:56.697154861 +0000 UTC m=+0.568075297 container died d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 26 20:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302-userdata-shm.mount: Deactivated successfully.
Feb 26 20:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d-merged.mount: Deactivated successfully.
Feb 26 20:35:56 compute-0 podman[186502]: 2026-02-26 20:35:56.749973427 +0000 UTC m=+0.620893863 container cleanup d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260223, config_id=nova_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 26 20:35:56 compute-0 podman[186502]: nova_compute
Feb 26 20:35:56 compute-0 podman[186539]: nova_compute
Feb 26 20:35:56 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 26 20:35:56 compute-0 systemd[1]: Stopped nova_compute container.
Feb 26 20:35:56 compute-0 systemd[1]: Starting nova_compute container...
Feb 26 20:35:56 compute-0 podman[186530]: 2026-02-26 20:35:56.845382196 +0000 UTC m=+0.090284175 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 26 20:35:56 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:35:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bc667c4a0be2b9d5355d7e355b71bad49b25090a3de15467d605e9477a0181d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:56 compute-0 podman[186566]: 2026-02-26 20:35:56.945807118 +0000 UTC m=+0.094782533 container init d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 26 20:35:56 compute-0 podman[186566]: 2026-02-26 20:35:56.950776859 +0000 UTC m=+0.099752254 container start d787abde11b09114c0e40368c3fdfb7adc9de4db5e3d16d167259bb8e3627302 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, config_id=nova_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:35:56 compute-0 nova_compute[186588]: + sudo -E kolla_set_configs
Feb 26 20:35:56 compute-0 podman[186566]: nova_compute
Feb 26 20:35:56 compute-0 systemd[1]: Started nova_compute container.
Feb 26 20:35:56 compute-0 sudo[186494]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Validating config file
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying service configuration files
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /etc/ceph
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Creating directory /etc/ceph
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /etc/ceph
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Writing out command to execute
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:57 compute-0 nova_compute[186588]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 26 20:35:57 compute-0 nova_compute[186588]: ++ cat /run_command
Feb 26 20:35:57 compute-0 nova_compute[186588]: + CMD=nova-compute
Feb 26 20:35:57 compute-0 nova_compute[186588]: + ARGS=
Feb 26 20:35:57 compute-0 nova_compute[186588]: + sudo kolla_copy_cacerts
Feb 26 20:35:57 compute-0 nova_compute[186588]: + [[ ! -n '' ]]
Feb 26 20:35:57 compute-0 nova_compute[186588]: + . kolla_extend_start
Feb 26 20:35:57 compute-0 nova_compute[186588]: Running command: 'nova-compute'
Feb 26 20:35:57 compute-0 nova_compute[186588]: + echo 'Running command: '\''nova-compute'\'''
Feb 26 20:35:57 compute-0 nova_compute[186588]: + umask 0022
Feb 26 20:35:57 compute-0 nova_compute[186588]: + exec nova-compute
Feb 26 20:35:57 compute-0 sudo[186749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwsffobqzmjgrymswfzfeuayjfomyyoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138157.1732244-1441-277543854576713/AnsiballZ_podman_container.py'
Feb 26 20:35:57 compute-0 sudo[186749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:35:57 compute-0 python3.9[186752]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 26 20:35:57 compute-0 systemd[1]: Started libpod-conmon-1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60.scope.
Feb 26 20:35:57 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:35:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bc6a3139b9223a9fa0762e4385d552bc0790cb1bc67b67c7b13b710936ecd7a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bc6a3139b9223a9fa0762e4385d552bc0790cb1bc67b67c7b13b710936ecd7a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5bc6a3139b9223a9fa0762e4385d552bc0790cb1bc67b67c7b13b710936ecd7a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 26 20:35:57 compute-0 podman[186775]: 2026-02-26 20:35:57.887384303 +0000 UTC m=+0.128295345 container init 1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:35:57 compute-0 podman[186775]: 2026-02-26 20:35:57.893191007 +0000 UTC m=+0.134102029 container start 1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 26 20:35:57 compute-0 python3.9[186752]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Applying nova statedir ownership
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 26 20:35:57 compute-0 nova_compute_init[186796]: INFO:nova_statedir:Nova statedir ownership complete
Feb 26 20:35:57 compute-0 systemd[1]: libpod-1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60.scope: Deactivated successfully.
Feb 26 20:35:57 compute-0 podman[186810]: 2026-02-26 20:35:57.965426448 +0000 UTC m=+0.020715071 container died 1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 26 20:35:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60-userdata-shm.mount: Deactivated successfully.
Feb 26 20:35:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-5bc6a3139b9223a9fa0762e4385d552bc0790cb1bc67b67c7b13b710936ecd7a-merged.mount: Deactivated successfully.
Feb 26 20:35:57 compute-0 podman[186810]: 2026-02-26 20:35:57.994659597 +0000 UTC m=+0.049948200 container cleanup 1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'f92f499e3370a24be5567cedbf2a47e0ad5296d2be1ba7f5d33cc3181c0be47b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:35:58 compute-0 sudo[186749]: pam_unix(sudo:session): session closed for user root
Feb 26 20:35:58 compute-0 systemd[1]: libpod-conmon-1dd3df62a23925d2ed9c055a3e2b41fa5b2d0ec0eba9721ddabb48c8257b7a60.scope: Deactivated successfully.
Feb 26 20:35:58 compute-0 sshd-session[161588]: Connection closed by 192.168.122.30 port 53652
Feb 26 20:35:58 compute-0 sshd-session[161585]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:35:58 compute-0 systemd-logind[825]: Session 23 logged out. Waiting for processes to exit.
Feb 26 20:35:58 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 26 20:35:58 compute-0 systemd[1]: session-23.scope: Consumed 1min 31.717s CPU time.
Feb 26 20:35:58 compute-0 systemd-logind[825]: Removed session 23.
Feb 26 20:35:58 compute-0 nova_compute[186588]: 2026-02-26 20:35:58.870 186592 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 26 20:35:58 compute-0 nova_compute[186588]: 2026-02-26 20:35:58.870 186592 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 26 20:35:58 compute-0 nova_compute[186588]: 2026-02-26 20:35:58.871 186592 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 26 20:35:58 compute-0 nova_compute[186588]: 2026-02-26 20:35:58.871 186592 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.042 186592 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.049 186592 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.050 186592 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.455 186592 INFO nova.virt.driver [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.547 186592 INFO nova.compute.provider_config [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.559 186592 DEBUG oslo_concurrency.lockutils [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.559 186592 DEBUG oslo_concurrency.lockutils [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.560 186592 DEBUG oslo_concurrency.lockutils [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.560 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.560 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.560 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.560 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.561 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.561 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.561 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.561 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.561 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.562 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.562 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.562 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.562 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.562 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.563 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.563 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.563 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.563 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.563 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.564 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.564 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.564 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.564 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.564 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.565 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.565 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.565 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.565 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.565 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.566 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.566 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.566 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.566 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.566 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.567 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.567 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.567 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.567 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.567 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.568 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.568 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.568 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.568 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.568 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.569 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.569 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.569 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.569 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.569 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.570 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.570 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.570 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.570 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.570 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.571 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.571 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.571 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.571 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.571 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.572 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.572 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.572 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.572 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.572 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.572 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.573 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.573 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.573 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.573 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.573 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.574 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.574 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.574 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.574 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.574 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.575 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.575 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.575 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.575 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.575 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.576 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.576 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.576 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.576 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.576 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.576 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.577 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.577 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.577 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.577 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.577 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.578 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.578 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.578 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.578 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.578 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.579 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.579 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.579 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.579 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.579 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.580 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.580 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.580 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.580 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.580 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.580 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.581 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.581 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.581 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.581 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.581 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.582 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.582 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.582 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.582 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.582 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.583 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.583 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.583 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.583 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.583 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.584 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.584 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.584 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.584 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.584 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.585 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.585 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.585 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.585 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.585 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.585 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.586 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.586 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.586 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.586 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.587 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.587 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.587 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.587 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.587 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.588 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.588 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.588 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.588 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.588 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.588 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.589 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.589 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.589 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.589 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.590 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.590 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.590 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.590 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.590 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.591 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.591 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.591 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.591 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.591 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.592 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.592 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.592 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.592 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.592 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.593 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.593 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.593 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.593 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.593 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.594 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.594 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.594 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.594 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.594 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.595 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.595 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.595 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.595 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.595 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.596 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.596 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.596 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.596 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.596 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.597 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.597 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.597 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.597 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.597 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.598 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.598 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.598 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.598 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.598 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.599 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.599 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.599 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.599 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.599 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.600 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.600 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.600 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.600 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.600 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.601 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.601 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.601 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.601 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.601 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.602 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.602 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.602 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.602 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.602 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.603 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.603 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.603 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.603 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.603 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.604 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.604 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.604 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.604 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.604 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.604 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.605 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.605 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.605 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.605 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.605 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.606 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.606 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.606 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.606 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.607 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.607 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.607 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.607 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.608 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.608 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.608 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.608 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.609 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.609 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.610 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.610 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.610 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.610 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.610 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.611 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.611 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.611 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.611 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.611 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.612 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.612 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.612 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.612 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.612 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.613 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.613 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.613 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.613 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.613 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.613 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.614 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.614 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.614 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.614 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.614 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.615 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.615 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.615 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.615 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.615 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.616 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.616 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.616 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.616 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.616 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.616 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.617 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.617 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.617 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.617 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.617 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.618 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.618 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.618 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.618 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.618 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.619 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.619 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.619 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.619 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.619 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.619 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.620 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.620 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.620 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.620 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.620 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.621 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.621 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.621 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.621 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.621 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.622 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.622 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.622 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.622 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.622 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.623 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.623 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.623 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.623 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.623 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.624 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.624 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.624 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.624 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.624 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.625 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.625 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.625 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.625 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.625 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.626 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.626 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.626 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.626 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.626 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.627 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.627 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.627 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.627 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.627 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.627 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.628 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.628 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.628 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.628 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.628 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.629 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.629 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.629 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.629 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.629 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.630 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.630 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.630 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.630 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.631 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.631 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.631 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.631 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.631 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.631 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.632 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.632 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.632 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.632 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.632 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.633 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.633 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.633 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.633 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.633 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.633 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.634 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.634 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.634 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.634 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.634 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.635 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.635 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.635 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.635 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.635 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.635 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.636 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.636 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.636 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.636 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.636 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.637 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.637 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.637 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.637 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.637 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.638 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.638 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.638 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.638 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.638 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.638 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.639 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.639 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.639 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.639 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.639 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.640 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.640 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.640 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.640 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.640 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.640 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.641 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.641 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.641 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.641 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.641 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.642 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.642 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.642 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.642 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.642 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.642 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.643 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.643 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.643 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.643 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.643 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.644 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.644 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.644 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.644 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.644 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.645 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.645 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.645 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.645 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.645 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.645 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.646 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.646 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.646 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.646 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.646 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.647 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.647 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.647 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.647 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.647 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.647 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.648 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.648 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.648 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.648 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.648 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.649 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.649 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.649 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.649 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.649 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.649 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.650 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.650 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.650 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.650 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.650 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.651 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.651 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.651 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.651 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.651 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.652 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.652 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.652 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.652 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.652 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.652 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.653 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.653 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.653 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.653 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.653 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.654 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.654 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.654 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.654 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.654 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.654 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.655 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.655 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.655 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.655 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.656 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.656 186592 WARNING oslo_config.cfg [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 26 20:35:59 compute-0 nova_compute[186588]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 26 20:35:59 compute-0 nova_compute[186588]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 26 20:35:59 compute-0 nova_compute[186588]: and ``live_migration_inbound_addr`` respectively.
Feb 26 20:35:59 compute-0 nova_compute[186588]: ).  Its value may be silently ignored in the future.
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.656 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.656 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.656 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.657 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.657 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.657 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.657 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.657 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.658 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.658 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.658 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.658 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.658 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.658 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.659 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.659 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.659 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.659 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.659 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.660 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.660 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.660 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.660 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.660 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.661 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.661 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.661 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.661 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.661 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.662 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.662 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.662 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.662 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.662 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.662 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.663 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.663 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.663 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.663 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.663 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.664 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.664 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.664 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.664 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.664 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.664 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.665 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.665 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.665 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.665 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.665 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.666 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.666 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.666 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.666 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.666 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.667 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.667 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.667 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.667 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.667 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.667 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.668 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.668 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.668 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.668 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.668 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.669 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.669 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.669 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.669 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.669 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.669 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.670 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.670 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.670 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.670 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.670 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.671 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.671 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.671 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.671 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.671 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.671 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.672 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.672 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.672 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.672 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.672 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.673 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.673 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.673 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.673 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.673 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.673 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.674 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.674 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.674 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.674 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.674 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.675 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.675 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.675 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.675 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.675 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.675 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.676 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.676 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.676 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.676 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.676 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.677 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.677 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.677 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.677 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.677 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.677 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.678 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.678 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.678 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.678 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.678 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.679 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.679 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.679 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.679 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.679 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.679 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.680 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.680 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.680 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.680 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.680 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.681 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.681 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.681 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.681 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.681 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.682 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.682 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.682 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.682 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.682 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.683 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.683 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.683 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.683 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.683 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.683 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.684 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.684 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.684 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.684 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.684 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.685 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.685 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.685 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.685 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.685 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.686 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.686 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.686 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.686 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.686 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.686 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.687 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.687 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.687 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.687 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.687 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.688 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.688 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.688 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.688 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.688 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.689 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.689 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.689 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.689 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.689 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.690 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.690 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.690 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.690 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.690 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.690 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.691 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.691 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.691 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.691 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.691 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.691 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.692 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.692 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.692 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.692 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.692 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.693 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.693 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.693 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.693 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.693 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.694 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.694 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.694 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.694 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.694 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.694 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.695 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.695 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.695 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.695 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.695 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.696 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.696 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.696 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.696 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.696 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.696 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.697 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.697 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.697 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.697 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.697 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.698 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.698 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.698 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.698 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.698 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.698 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.699 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.699 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.699 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.699 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.699 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.700 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.700 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.700 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.700 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.700 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.700 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.701 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.701 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.701 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.701 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.701 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.702 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.702 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.702 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.702 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.702 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.703 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.703 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.703 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.703 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.703 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.704 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.704 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.704 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.704 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.704 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.704 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.705 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.705 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.705 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.705 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.705 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.706 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.706 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.706 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.706 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.706 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.707 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.707 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.707 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.707 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.707 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.707 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.708 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.708 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.708 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.708 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.709 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.709 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.709 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.709 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.709 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.709 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.710 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.710 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.710 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.710 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.710 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.711 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.711 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.711 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.711 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.711 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.712 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.712 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.712 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.712 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.712 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.712 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.713 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.713 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.713 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.713 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.713 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.714 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.714 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.714 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.714 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.714 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.715 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.715 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.715 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.715 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.715 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.715 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.716 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.716 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.716 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.716 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.716 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.717 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.717 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.717 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.717 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.717 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.718 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.718 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.718 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.718 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.719 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.719 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.719 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.720 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.720 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.720 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.721 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.721 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.721 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.722 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.722 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.722 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.723 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.724 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.725 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.726 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.727 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.728 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.729 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.730 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.731 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.732 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.733 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.733 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.733 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.733 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.733 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.733 186592 DEBUG oslo_service.service [None req-871b2661-ffbc-4eb2-aef8-7108cd29e8db - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.734 186592 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.746 186592 INFO nova.virt.node [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Determined node identity 895ba9a7-707f-4e79-9130-ec9b9afa47ee from /var/lib/nova/compute_id
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.747 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.747 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.748 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.748 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.756 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe9000d8250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.758 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe9000d8250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.759 186592 INFO nova.virt.libvirt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Connection event '1' reason 'None'
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.764 186592 INFO nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Libvirt host capabilities <capabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]: 
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <host>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <uuid>35e489ed-3c64-48cc-802f-42161f451b28</uuid>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <arch>x86_64</arch>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model>EPYC-Rome-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <vendor>AMD</vendor>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <microcode version='16777317'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <signature family='23' model='49' stepping='0'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='x2apic'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='tsc-deadline'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='osxsave'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='hypervisor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='tsc_adjust'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='spec-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='stibp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='arch-capabilities'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='cmp_legacy'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='topoext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='virt-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='lbrv'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='tsc-scale'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='vmcb-clean'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='pause-filter'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='pfthreshold'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='svme-addr-chk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='rdctl-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='skip-l1dfl-vmentry'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='mds-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature name='pschange-mc-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <pages unit='KiB' size='4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <pages unit='KiB' size='2048'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <pages unit='KiB' size='1048576'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <power_management>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <suspend_mem/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <suspend_disk/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <suspend_hybrid/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </power_management>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <iommu support='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <migration_features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <live/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <uri_transports>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <uri_transport>tcp</uri_transport>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <uri_transport>rdma</uri_transport>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </uri_transports>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </migration_features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <topology>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <cells num='1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <cell id='0'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           <memory unit='KiB'>7864276</memory>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           <pages unit='KiB' size='4'>1966069</pages>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           <pages unit='KiB' size='2048'>0</pages>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           <distances>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <sibling id='0' value='10'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           </distances>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           <cpus num='8'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:           </cpus>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         </cell>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </cells>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </topology>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <cache>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </cache>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <secmodel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model>selinux</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <doi>0</doi>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </secmodel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <secmodel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model>dac</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <doi>0</doi>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </secmodel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </host>
Feb 26 20:35:59 compute-0 nova_compute[186588]: 
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <guest>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <os_type>hvm</os_type>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <arch name='i686'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <wordsize>32</wordsize>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <domain type='qemu'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <domain type='kvm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </arch>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <pae/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <nonpae/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <acpi default='on' toggle='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <apic default='on' toggle='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <cpuselection/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <deviceboot/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <disksnapshot default='on' toggle='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <externalSnapshot/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </guest>
Feb 26 20:35:59 compute-0 nova_compute[186588]: 
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <guest>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <os_type>hvm</os_type>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <arch name='x86_64'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <wordsize>64</wordsize>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <domain type='qemu'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <domain type='kvm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </arch>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <acpi default='on' toggle='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <apic default='on' toggle='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <cpuselection/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <deviceboot/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <disksnapshot default='on' toggle='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <externalSnapshot/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </guest>
Feb 26 20:35:59 compute-0 nova_compute[186588]: 
Feb 26 20:35:59 compute-0 nova_compute[186588]: </capabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]: 
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.770 186592 DEBUG nova.virt.libvirt.volume.mount [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.773 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.776 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 26 20:35:59 compute-0 nova_compute[186588]: <domainCapabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <domain>kvm</domain>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <arch>i686</arch>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <vcpu max='4096'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <iothreads supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <os supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <enum name='firmware'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <loader supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>rom</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pflash</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='readonly'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>yes</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='secure'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </loader>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </os>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='maximumMigratable'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <vendor>AMD</vendor>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='succor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='custom' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='KnightsMill'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='KnightsMill-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='athlon'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='athlon-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='core2duo'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='core2duo-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='coreduo'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='coreduo-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='n270'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='n270-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='phenom'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='phenom-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <memoryBacking supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <enum name='sourceType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>file</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>anonymous</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>memfd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </memoryBacking>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <disk supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='diskDevice'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>disk</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>cdrom</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>floppy</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>lun</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>fdc</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>sata</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <graphics supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vnc</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>egl-headless</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </graphics>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <video supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='modelType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vga</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>cirrus</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>none</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>bochs</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>ramfb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </video>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <hostdev supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='mode'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>subsystem</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='startupPolicy'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>mandatory</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>requisite</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>optional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='subsysType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pci</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='capsType'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='pciBackend'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </hostdev>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <rng supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>random</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>egd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <filesystem supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='driverType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>path</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>handle</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtiofs</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </filesystem>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <tpm supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tpm-tis</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tpm-crb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>emulator</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>external</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendVersion'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>2.0</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </tpm>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <redirdev supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </redirdev>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <channel supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </channel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <crypto supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>qemu</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </crypto>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <interface supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>passt</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <panic supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>isa</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>hyperv</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </panic>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <console supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>null</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vc</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>dev</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>file</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pipe</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>stdio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>udp</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tcp</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>qemu-vdagent</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </console>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <gic supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <vmcoreinfo supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <genid supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <backingStoreInput supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <backup supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <async-teardown supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <s390-pv supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <ps2 supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <tdx supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <sev supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <sgx supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <hyperv supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='features'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>relaxed</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vapic</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>spinlocks</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vpindex</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>runtime</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>synic</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>stimer</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>reset</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vendor_id</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>frequencies</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>reenlightenment</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tlbflush</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>ipi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>avic</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>emsr_bitmap</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>xmm_input</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <defaults>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <spinlocks>4095</spinlocks>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <stimer_direct>on</stimer_direct>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </defaults>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </hyperv>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <launchSecurity supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </features>
Feb 26 20:35:59 compute-0 nova_compute[186588]: </domainCapabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.782 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 26 20:35:59 compute-0 nova_compute[186588]: <domainCapabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <domain>kvm</domain>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <arch>i686</arch>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <vcpu max='240'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <iothreads supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <os supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <enum name='firmware'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <loader supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>rom</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pflash</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='readonly'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>yes</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='secure'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </loader>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </os>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='maximumMigratable'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <vendor>AMD</vendor>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='succor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='custom' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Haswell-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='KnightsMill'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='KnightsMill-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='athlon'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='athlon-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='core2duo'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='core2duo-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='coreduo'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='coreduo-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='n270'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='n270-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='phenom'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='phenom-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <memoryBacking supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <enum name='sourceType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>file</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>anonymous</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>memfd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </memoryBacking>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <disk supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='diskDevice'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>disk</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>cdrom</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>floppy</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>lun</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>ide</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>fdc</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>sata</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <graphics supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vnc</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>egl-headless</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </graphics>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <video supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='modelType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vga</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>cirrus</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>none</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>bochs</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>ramfb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </video>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <hostdev supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='mode'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>subsystem</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='startupPolicy'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>mandatory</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>requisite</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>optional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='subsysType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pci</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='capsType'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='pciBackend'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </hostdev>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <rng supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>random</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>egd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <filesystem supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='driverType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>path</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>handle</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>virtiofs</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </filesystem>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <tpm supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tpm-tis</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tpm-crb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>emulator</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>external</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendVersion'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>2.0</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </tpm>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <redirdev supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </redirdev>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <channel supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </channel>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <crypto supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>qemu</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </crypto>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <interface supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='backendType'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>passt</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <panic supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>isa</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>hyperv</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </panic>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <console supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>null</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vc</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>dev</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>file</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pipe</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>stdio</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>udp</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tcp</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>qemu-vdagent</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </console>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <features>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <gic supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <vmcoreinfo supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <genid supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <backingStoreInput supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <backup supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <async-teardown supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <s390-pv supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <ps2 supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <tdx supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <sev supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <sgx supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <hyperv supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='features'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>relaxed</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vapic</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>spinlocks</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vpindex</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>runtime</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>synic</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>stimer</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>reset</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>vendor_id</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>frequencies</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>reenlightenment</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>tlbflush</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>ipi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>avic</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>emsr_bitmap</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>xmm_input</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <defaults>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <spinlocks>4095</spinlocks>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <stimer_direct>on</stimer_direct>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </defaults>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </hyperv>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <launchSecurity supported='no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </features>
Feb 26 20:35:59 compute-0 nova_compute[186588]: </domainCapabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.829 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 26 20:35:59 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.833 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 26 20:35:59 compute-0 nova_compute[186588]: <domainCapabilities>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <domain>kvm</domain>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <arch>x86_64</arch>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <vcpu max='4096'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <iothreads supported='yes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <os supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <enum name='firmware'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>efi</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <loader supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>rom</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>pflash</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='readonly'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>yes</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='secure'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>yes</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </loader>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   </os>
Feb 26 20:35:59 compute-0 nova_compute[186588]:   <cpu>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='maximum' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <enum name='maximumMigratable'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='host-model' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <vendor>AMD</vendor>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='x2apic'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='stibp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='succor'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='lbrv'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:35:59 compute-0 nova_compute[186588]:     <mode name='custom' supported='yes'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Denverton-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='Dhyana-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v3'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v4'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='EPYC-v5'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:35:59 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids'>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:35:59 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='KnightsMill'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='KnightsMill-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='athlon'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='athlon-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='core2duo'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='core2duo-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='coreduo'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='coreduo-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='n270'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='n270-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='phenom'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='phenom-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <memoryBacking supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <enum name='sourceType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>file</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>anonymous</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>memfd</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </memoryBacking>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <disk supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='diskDevice'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>disk</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>cdrom</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>floppy</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>lun</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>fdc</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>sata</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <graphics supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vnc</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>egl-headless</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </graphics>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <video supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='modelType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vga</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>cirrus</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>none</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>bochs</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>ramfb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </video>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <hostdev supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='mode'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>subsystem</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='startupPolicy'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>mandatory</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>requisite</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>optional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='subsysType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pci</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='capsType'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='pciBackend'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </hostdev>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <rng supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>random</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>egd</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <filesystem supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='driverType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>path</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>handle</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtiofs</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </filesystem>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <tpm supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tpm-tis</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tpm-crb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>emulator</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>external</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendVersion'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>2.0</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </tpm>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <redirdev supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </redirdev>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <channel supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </channel>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <crypto supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>qemu</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </crypto>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <interface supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>passt</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <panic supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>isa</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>hyperv</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </panic>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <console supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>null</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vc</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>dev</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>file</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pipe</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>stdio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>udp</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tcp</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>qemu-vdagent</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </console>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <features>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <gic supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <vmcoreinfo supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <genid supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <backingStoreInput supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <backup supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <async-teardown supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <s390-pv supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <ps2 supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <tdx supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <sev supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <sgx supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <hyperv supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='features'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>relaxed</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vapic</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>spinlocks</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vpindex</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>runtime</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>synic</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>stimer</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>reset</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vendor_id</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>frequencies</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>reenlightenment</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tlbflush</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>ipi</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>avic</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>emsr_bitmap</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>xmm_input</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <defaults>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <spinlocks>4095</spinlocks>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <stimer_direct>on</stimer_direct>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </defaults>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </hyperv>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <launchSecurity supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </features>
Feb 26 20:36:00 compute-0 nova_compute[186588]: </domainCapabilities>
Feb 26 20:36:00 compute-0 nova_compute[186588]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:35:59.952 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 26 20:36:00 compute-0 nova_compute[186588]: <domainCapabilities>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <path>/usr/libexec/qemu-kvm</path>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <domain>kvm</domain>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <arch>x86_64</arch>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <vcpu max='240'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <iothreads supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <os supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <enum name='firmware'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <loader supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>rom</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pflash</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='readonly'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>yes</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='secure'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>no</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </loader>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </os>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <cpu>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <mode name='host-passthrough' supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='hostPassthroughMigratable'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <mode name='maximum' supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='maximumMigratable'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>on</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>off</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <mode name='host-model' supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <vendor>AMD</vendor>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='x2apic'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-deadline'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='hypervisor'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc_adjust'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='spec-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='stibp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='ssbd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='cmp_legacy'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='overflow-recov'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='succor'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='ibrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='amd-ssbd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='virt-ssbd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='lbrv'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='tsc-scale'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='vmcb-clean'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='flushbyasid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='pause-filter'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='pfthreshold'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='svme-addr-chk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <feature policy='disable' name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <mode name='custom' supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Broadwell-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cascadelake-Server-v5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='ClearwaterForest-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ddpd-u'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sha512'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sm3'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sm4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cooperlake'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Cooperlake-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Denverton'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Denverton-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Denverton-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Denverton-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Dhyana-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Genoa-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Milan-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Rome-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-Turin-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amd-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='auto-ibrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vp2intersect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fs-gs-base-ns'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibpb-brtype'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='no-nested-data-bp'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='null-sel-clr-base'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='perfmon-v2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbpb'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='srso-user-kernel-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='stibp-always-on'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='EPYC-v5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='GraniteRapids-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-128'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-256'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx10-512'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='prefetchiti'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Haswell-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-noTSX'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v6'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Icelake-Server-v7'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='IvyBridge-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='KnightsMill'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='KnightsMill-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4fmaps'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-4vnniw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512er'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512pf'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G4-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Opteron_G5-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fma4'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tbm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xop'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SapphireRapids-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='amx-tile'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-bf16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-fp16'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512-vpopcntdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bitalg'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vbmi2'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrc'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fzrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='la57'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='taa-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='tsx-ldtrk'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='SierraForest-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ifma'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-ne-convert'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx-vnni-int8'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bhi-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='bus-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cmpccxadd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fbsdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='fsrs'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ibrs-all'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='intel-psfd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ipred-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='lam'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mcdt-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pbrsb-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='psdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rrsba-ctrl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='sbdr-ssdp-no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='serialize'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vaes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='vpclmulqdq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Client-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='hle'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='rtm'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Skylake-Server-v5'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512bw'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512cd'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512dq'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512f'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='avx512vl'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='invpcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pcid'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='pku'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='mpx'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v2'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v3'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='core-capability'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='split-lock-detect'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='Snowridge-v4'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='cldemote'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='erms'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='gfni'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdir64b'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='movdiri'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='xsaves'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='athlon'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='athlon-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='core2duo'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='core2duo-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='coreduo'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='coreduo-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='n270'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='n270-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='ss'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='phenom'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <blockers model='phenom-v1'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnow'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <feature name='3dnowext'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </blockers>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </mode>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <memoryBacking supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <enum name='sourceType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>file</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>anonymous</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <value>memfd</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </memoryBacking>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <disk supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='diskDevice'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>disk</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>cdrom</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>floppy</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>lun</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>ide</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>fdc</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>sata</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <graphics supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vnc</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>egl-headless</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </graphics>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <video supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='modelType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vga</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>cirrus</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>none</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>bochs</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>ramfb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </video>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <hostdev supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='mode'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>subsystem</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='startupPolicy'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>mandatory</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>requisite</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>optional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='subsysType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pci</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>scsi</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='capsType'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='pciBackend'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </hostdev>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <rng supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtio-non-transitional</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>random</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>egd</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <filesystem supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='driverType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>path</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>handle</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>virtiofs</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </filesystem>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <tpm supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tpm-tis</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tpm-crb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>emulator</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>external</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendVersion'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>2.0</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </tpm>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <redirdev supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='bus'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>usb</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </redirdev>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <channel supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </channel>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <crypto supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>qemu</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendModel'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>builtin</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </crypto>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <interface supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='backendType'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>default</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>passt</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <panic supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='model'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>isa</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>hyperv</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </panic>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <console supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='type'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>null</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vc</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pty</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>dev</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>file</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>pipe</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>stdio</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>udp</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tcp</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>unix</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>qemu-vdagent</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>dbus</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </console>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   <features>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <gic supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <vmcoreinfo supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <genid supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <backingStoreInput supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <backup supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <async-teardown supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <s390-pv supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <ps2 supported='yes'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <tdx supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <sev supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <sgx supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <hyperv supported='yes'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <enum name='features'>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>relaxed</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vapic</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>spinlocks</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vpindex</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>runtime</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>synic</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>stimer</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>reset</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>vendor_id</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>frequencies</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>reenlightenment</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>tlbflush</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>ipi</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>avic</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>emsr_bitmap</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <value>xmm_input</value>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </enum>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       <defaults>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <spinlocks>4095</spinlocks>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <stimer_direct>on</stimer_direct>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <tlbflush_direct>on</tlbflush_direct>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <tlbflush_extended>on</tlbflush_extended>
Feb 26 20:36:00 compute-0 nova_compute[186588]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 26 20:36:00 compute-0 nova_compute[186588]:       </defaults>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     </hyperv>
Feb 26 20:36:00 compute-0 nova_compute[186588]:     <launchSecurity supported='no'/>
Feb 26 20:36:00 compute-0 nova_compute[186588]:   </features>
Feb 26 20:36:00 compute-0 nova_compute[186588]: </domainCapabilities>
Feb 26 20:36:00 compute-0 nova_compute[186588]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.069 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.069 186592 INFO nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Secure Boot support detected
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.071 186592 INFO nova.virt.libvirt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.072 186592 INFO nova.virt.libvirt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.080 186592 DEBUG nova.virt.libvirt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.096 186592 INFO nova.virt.node [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Determined node identity 895ba9a7-707f-4e79-9130-ec9b9afa47ee from /var/lib/nova/compute_id
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.114 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Verified node 895ba9a7-707f-4e79-9130-ec9b9afa47ee matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.170 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.280 186592 DEBUG oslo_concurrency.lockutils [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.281 186592 DEBUG oslo_concurrency.lockutils [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.281 186592 DEBUG oslo_concurrency.lockutils [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.281 186592 DEBUG nova.compute.resource_tracker [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.461 186592 WARNING nova.virt.libvirt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.462 186592 DEBUG nova.compute.resource_tracker [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6074MB free_disk=72.96935272216797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.462 186592 DEBUG oslo_concurrency.lockutils [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.463 186592 DEBUG oslo_concurrency.lockutils [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.614 186592 DEBUG nova.compute.resource_tracker [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.614 186592 DEBUG nova.compute.resource_tracker [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.695 186592 DEBUG nova.scheduler.client.report [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Refreshing inventories for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.710 186592 DEBUG nova.scheduler.client.report [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Updating ProviderTree inventory for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.710 186592 DEBUG nova.compute.provider_tree [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.726 186592 DEBUG nova.scheduler.client.report [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Refreshing aggregate associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.766 186592 DEBUG nova.scheduler.client.report [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Refreshing trait associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.790 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 26 20:36:00 compute-0 nova_compute[186588]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.791 186592 INFO nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] kernel doesn't support AMD SEV
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.791 186592 DEBUG nova.compute.provider_tree [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.791 186592 DEBUG nova.virt.libvirt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.810 186592 DEBUG nova.scheduler.client.report [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.827 186592 DEBUG nova.compute.resource_tracker [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.827 186592 DEBUG oslo_concurrency.lockutils [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.827 186592 DEBUG nova.service [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.868 186592 DEBUG nova.service [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 26 20:36:00 compute-0 nova_compute[186588]: 2026-02-26 20:36:00.869 186592 DEBUG nova.servicegroup.drivers.db [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 26 20:36:03 compute-0 sshd-session[186885]: Accepted publickey for zuul from 192.168.122.30 port 58936 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:36:03 compute-0 systemd-logind[825]: New session 25 of user zuul.
Feb 26 20:36:03 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 26 20:36:03 compute-0 sshd-session[186885]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:36:04 compute-0 python3.9[187038]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 26 20:36:05 compute-0 nova_compute[186588]: 2026-02-26 20:36:05.871 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:05 compute-0 nova_compute[186588]: 2026-02-26 20:36:05.929 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:05 compute-0 sudo[187192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llaxbxhwlrjimhfgmuycdfuvmmcjocmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138165.4082441-31-147275030218830/AnsiballZ_systemd_service.py'
Feb 26 20:36:05 compute-0 sudo[187192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:06 compute-0 python3.9[187195]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:36:06 compute-0 systemd[1]: Reloading.
Feb 26 20:36:06 compute-0 systemd-sysv-generator[187218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:36:06 compute-0 systemd-rc-local-generator[187215]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:36:06 compute-0 sudo[187192]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:07 compute-0 python3.9[187388]: ansible-ansible.builtin.service_facts Invoked
Feb 26 20:36:07 compute-0 network[187405]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 26 20:36:07 compute-0 network[187406]: 'network-scripts' will be removed from distribution in near future.
Feb 26 20:36:07 compute-0 network[187407]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 26 20:36:10 compute-0 sudo[187678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxwrqpqhtsyrxvppjjtkxeatzkbxeops ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138169.9036229-50-99621416446659/AnsiballZ_systemd_service.py'
Feb 26 20:36:10 compute-0 sudo[187678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:10 compute-0 python3.9[187681]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:36:10 compute-0 sudo[187678]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:11 compute-0 sudo[187832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iktoyhtyjlywmsmybqrxcdrscvkqorjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138170.800043-60-188641362326983/AnsiballZ_file.py'
Feb 26 20:36:11 compute-0 sudo[187832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:11 compute-0 python3.9[187835]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:11 compute-0 sudo[187832]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:11 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:36:11 compute-0 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 26 20:36:11 compute-0 sudo[187986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebutdzlzvceowjariwqfekbajpkukxyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138171.5106971-68-175006569005118/AnsiballZ_file.py'
Feb 26 20:36:11 compute-0 sudo[187986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:11 compute-0 python3.9[187989]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:11 compute-0 sudo[187986]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:12 compute-0 sudo[188139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxslrpclsnjawbvcksbupjgtikabtlcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138172.1166975-77-80192491717359/AnsiballZ_command.py'
Feb 26 20:36:12 compute-0 sudo[188139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:12 compute-0 python3.9[188142]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:36:12 compute-0 sudo[188139]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:13 compute-0 python3.9[188294]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 26 20:36:14 compute-0 sudo[188444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwywnoukjqcixqyhydcwyxgvgwhkkpkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138173.8674908-95-103036253267541/AnsiballZ_systemd_service.py'
Feb 26 20:36:14 compute-0 sudo[188444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:14 compute-0 python3.9[188447]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:36:14 compute-0 systemd[1]: Reloading.
Feb 26 20:36:14 compute-0 systemd-sysv-generator[188477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:36:14 compute-0 systemd-rc-local-generator[188473]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:36:14 compute-0 sudo[188444]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:15 compute-0 sudo[188640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbnrlnewgbyymsnqbrbzzxkysqrmyvdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138174.9324343-103-16825858098519/AnsiballZ_command.py'
Feb 26 20:36:15 compute-0 sudo[188640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:15 compute-0 python3.9[188643]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:36:15 compute-0 sudo[188640]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:15 compute-0 sudo[188794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmtxopsqardunapesvwwtqchaimoxhaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138175.6487029-112-254521359528228/AnsiballZ_file.py'
Feb 26 20:36:15 compute-0 sudo[188794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:16 compute-0 python3.9[188797]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:16 compute-0 sudo[188794]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:16 compute-0 python3.9[188947]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:17 compute-0 sudo[189099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keamyewhvvuzobswyrricmlcllywxjgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138177.0047386-128-255659485017511/AnsiballZ_group.py'
Feb 26 20:36:17 compute-0 sudo[189099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:17 compute-0 python3.9[189102]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 26 20:36:17 compute-0 sudo[189099]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:18 compute-0 sudo[189252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldypvomyeiplygxvfahptsymbdzuxfcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138177.856875-139-89316471030979/AnsiballZ_getent.py'
Feb 26 20:36:18 compute-0 sudo[189252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:18 compute-0 python3.9[189255]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 26 20:36:18 compute-0 sudo[189252]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:18 compute-0 sudo[189417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsqgelkszwaznsqilsckqmxbeqsauzwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138178.6480865-147-231309863668932/AnsiballZ_group.py'
Feb 26 20:36:18 compute-0 sudo[189417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:18 compute-0 podman[189380]: 2026-02-26 20:36:18.952978924 +0000 UTC m=+0.086183274 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 26 20:36:19 compute-0 python3.9[189422]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 26 20:36:19 compute-0 groupadd[189429]: group added to /etc/group: name=ceilometer, GID=42405
Feb 26 20:36:19 compute-0 groupadd[189429]: group added to /etc/gshadow: name=ceilometer
Feb 26 20:36:19 compute-0 groupadd[189429]: new group: name=ceilometer, GID=42405
Feb 26 20:36:19 compute-0 sudo[189417]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:19 compute-0 sudo[189584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzkbvweoyulwihykbsmwejtamkxecpgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138179.2703693-155-167188086529308/AnsiballZ_user.py'
Feb 26 20:36:19 compute-0 sudo[189584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:19 compute-0 python3.9[189587]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 26 20:36:20 compute-0 useradd[189589]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 26 20:36:20 compute-0 useradd[189589]: add 'ceilometer' to group 'libvirt'
Feb 26 20:36:20 compute-0 useradd[189589]: add 'ceilometer' to shadow group 'libvirt'
Feb 26 20:36:20 compute-0 sudo[189584]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:21 compute-0 python3.9[189745]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:21 compute-0 python3.9[189866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1772138180.69382-181-105464287494005/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:22 compute-0 python3.9[190016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:22 compute-0 python3.9[190137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1772138181.8144953-181-47195084866667/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:23 compute-0 python3.9[190287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:23 compute-0 python3.9[190408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1772138182.7729206-181-159125411956745/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:24 compute-0 python3.9[190558]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:24 compute-0 python3.9[190710]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:25 compute-0 python3.9[190862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:25 compute-0 python3.9[190983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138184.7899976-240-18557559945160/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:26 compute-0 python3.9[191133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:26 compute-0 python3.9[191254]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138185.741859-240-116336894717680/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:27 compute-0 podman[191378]: 2026-02-26 20:36:27.158876229 +0000 UTC m=+0.128110241 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 26 20:36:27 compute-0 python3.9[191417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:27 compute-0 python3.9[191552]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138186.8208191-269-113209815887749/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:28 compute-0 python3.9[191702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:28 compute-0 python3.9[191823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138187.921446-285-45294729229219/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:29 compute-0 python3.9[191973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:29 compute-0 python3.9[192094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138188.885258-300-151868035127807/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:30 compute-0 python3.9[192244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:30 compute-0 python3.9[192365]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138189.8259254-315-187184978170951/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:31 compute-0 sudo[192515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjvgwncqhfvxaphigrseemllxjvjprfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138190.8303154-330-133191768490358/AnsiballZ_file.py'
Feb 26 20:36:31 compute-0 sudo[192515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:31 compute-0 python3.9[192518]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:31 compute-0 sudo[192515]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:31 compute-0 sudo[192668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmakabeofyhabiiuqvhxatuggspgvwlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138191.503306-338-105049880365453/AnsiballZ_file.py'
Feb 26 20:36:31 compute-0 sudo[192668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:31 compute-0 python3.9[192671]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:31 compute-0 sudo[192668]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:32 compute-0 python3.9[192821]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:33 compute-0 python3.9[192973]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:33 compute-0 python3.9[193125]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:34 compute-0 sudo[193277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbusppszcsnslnxrwdmfojivfbajggai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138193.8765314-370-59604048332610/AnsiballZ_file.py'
Feb 26 20:36:34 compute-0 sudo[193277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:34 compute-0 python3.9[193280]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:34 compute-0 sudo[193277]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:34 compute-0 sudo[193430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnmobrgiwmdfvwizdldzehjcadufhpsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138194.5579436-378-65706981258313/AnsiballZ_systemd_service.py'
Feb 26 20:36:34 compute-0 sudo[193430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:35 compute-0 python3.9[193433]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:36:35 compute-0 systemd[1]: Reloading.
Feb 26 20:36:35 compute-0 systemd-rc-local-generator[193458]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:36:35 compute-0 systemd-sysv-generator[193463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:36:35 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 26 20:36:35 compute-0 sudo[193430]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:36 compute-0 sudo[193628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqfvllnnvrggavtztdqubqvgftuqclqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138195.908371-387-8894070509132/AnsiballZ_stat.py'
Feb 26 20:36:36 compute-0 sudo[193628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:36 compute-0 python3.9[193631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:36 compute-0 sudo[193628]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:36 compute-0 sudo[193752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rofifhxhyccovarlwhfrqvydgvegkkad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138195.908371-387-8894070509132/AnsiballZ_copy.py'
Feb 26 20:36:36 compute-0 sudo[193752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:36 compute-0 python3.9[193755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138195.908371-387-8894070509132/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:36 compute-0 sudo[193752]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:37 compute-0 sudo[193829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxbkffltgwqkgomuukbbozwljbnmmemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138195.908371-387-8894070509132/AnsiballZ_stat.py'
Feb 26 20:36:37 compute-0 sudo[193829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:37 compute-0 python3.9[193832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:37 compute-0 sudo[193829]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:37 compute-0 sudo[193953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojpuwikghtmcldbgldpguuczxlkopdli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138195.908371-387-8894070509132/AnsiballZ_copy.py'
Feb 26 20:36:37 compute-0 sudo[193953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:37 compute-0 python3.9[193956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138195.908371-387-8894070509132/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:37 compute-0 sudo[193953]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:38 compute-0 sudo[194106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epclczpwtsqiumsswevsweoydbhtcldl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138198.2532957-419-193621867326560/AnsiballZ_file.py'
Feb 26 20:36:38 compute-0 sudo[194106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:38 compute-0 python3.9[194109]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:38 compute-0 sudo[194106]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:39 compute-0 sudo[194259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arskzonhofkyrbcyfrxlelxyokhrhjxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138198.926229-427-139661343415939/AnsiballZ_file.py'
Feb 26 20:36:39 compute-0 sudo[194259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:39 compute-0 python3.9[194262]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:39 compute-0 sudo[194259]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:39 compute-0 sudo[194412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbraqoyfkxzfzotefztuorizjbdxhjkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138199.5008285-435-6397625886528/AnsiballZ_stat.py'
Feb 26 20:36:39 compute-0 sudo[194412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:39 compute-0 python3.9[194415]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:39 compute-0 sudo[194412]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:40 compute-0 sudo[194536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tabbgzzihmmzozxfdpeuolahvchjijpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138199.5008285-435-6397625886528/AnsiballZ_copy.py'
Feb 26 20:36:40 compute-0 sudo[194536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:40 compute-0 python3.9[194539]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138199.5008285-435-6397625886528/.source.json _original_basename=.8lq87i_y follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:40 compute-0 sudo[194536]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:40 compute-0 python3.9[194689]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:42 compute-0 sudo[195110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjcczqhihpagrtmnuzimlcehoezokeqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138202.373321-475-93766855623375/AnsiballZ_container_config_data.py'
Feb 26 20:36:42 compute-0 sudo[195110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:42 compute-0 python3.9[195113]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Feb 26 20:36:43 compute-0 sudo[195110]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:43 compute-0 sudo[195263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-getcxseymgwmjxjeikazklxmpulnqxfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138203.32823-486-259561263885435/AnsiballZ_container_config_hash.py'
Feb 26 20:36:43 compute-0 sudo[195263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:43 compute-0 python3.9[195266]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:36:43 compute-0 sudo[195263]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:44 compute-0 sudo[195416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgygxwbecioweqthydbopnrayghmyrui ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138204.1627827-496-199813046564546/AnsiballZ_edpm_container_manage.py'
Feb 26 20:36:44 compute-0 sudo[195416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:44 compute-0 python3[195419]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:36:45 compute-0 podman[195458]: 2026-02-26 20:36:45.021540476 +0000 UTC m=+0.043775031 container create e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 26 20:36:45 compute-0 podman[195458]: 2026-02-26 20:36:44.999221129 +0000 UTC m=+0.021455704 image pull 85a67c09da63837d01bdd446430e96c969ea53b46c93eebb5caba564f6cc2835 quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested
Feb 26 20:36:45 compute-0 python3[195419]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested kolla_start
Feb 26 20:36:45 compute-0 sudo[195416]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:45 compute-0 sudo[195647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqalqprpmjuaqapepfrjjcwtmljshbfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138205.2657845-504-108287564798395/AnsiballZ_stat.py'
Feb 26 20:36:45 compute-0 sudo[195647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:45 compute-0 python3.9[195650]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:45 compute-0 sudo[195647]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:46 compute-0 sudo[195802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umyhlnuqwkykiynfnoqdkpktbynyxfkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138205.9754806-513-170674915742940/AnsiballZ_file.py'
Feb 26 20:36:46 compute-0 sudo[195802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:46 compute-0 python3.9[195805]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:46 compute-0 sudo[195802]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:36:46.498 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:36:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:36:46.500 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:36:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:36:46.501 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:36:46 compute-0 sudo[195879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyxilllffgvnwlxkhniwgtltutvzmnvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138205.9754806-513-170674915742940/AnsiballZ_stat.py'
Feb 26 20:36:46 compute-0 sudo[195879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:46 compute-0 python3.9[195882]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:36:46 compute-0 sudo[195879]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:47 compute-0 sudo[196031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weatxaterbhqogykesfohnhreepypukp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138206.9074433-513-118108697585067/AnsiballZ_copy.py'
Feb 26 20:36:47 compute-0 sudo[196031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:47 compute-0 python3.9[196034]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772138206.9074433-513-118108697585067/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:47 compute-0 sudo[196031]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:48 compute-0 sudo[196108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jocijtxmtslaxwusavlsotidwtbwselb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138206.9074433-513-118108697585067/AnsiballZ_systemd.py'
Feb 26 20:36:48 compute-0 sudo[196108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:48 compute-0 python3.9[196111]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:36:48 compute-0 systemd[1]: Reloading.
Feb 26 20:36:48 compute-0 systemd-rc-local-generator[196139]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:36:48 compute-0 systemd-sysv-generator[196142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:36:48 compute-0 sudo[196108]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:48 compute-0 sudo[196226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydvytnegeeqvwdnsjqkhkfxnvhtyqfkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138206.9074433-513-118108697585067/AnsiballZ_systemd.py'
Feb 26 20:36:48 compute-0 sudo[196226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:49 compute-0 python3.9[196229]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:36:49 compute-0 systemd[1]: Reloading.
Feb 26 20:36:49 compute-0 podman[196231]: 2026-02-26 20:36:49.330220148 +0000 UTC m=+0.079462039 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:36:49 compute-0 systemd-rc-local-generator[196274]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:36:49 compute-0 systemd-sysv-generator[196280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:36:49 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Feb 26 20:36:49 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:36:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/640f81719b4cf3b828944bfd745518b4b391d346ff1dc3d3971b8d756687f4da/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Feb 26 20:36:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/640f81719b4cf3b828944bfd745518b4b391d346ff1dc3d3971b8d756687f4da/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 26 20:36:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/640f81719b4cf3b828944bfd745518b4b391d346ff1dc3d3971b8d756687f4da/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Feb 26 20:36:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/640f81719b4cf3b828944bfd745518b4b391d346ff1dc3d3971b8d756687f4da/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Feb 26 20:36:49 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd.
Feb 26 20:36:49 compute-0 podman[196295]: 2026-02-26 20:36:49.735273444 +0000 UTC m=+0.131435314 container init e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + sudo -E kolla_set_configs
Feb 26 20:36:49 compute-0 podman[196295]: 2026-02-26 20:36:49.767142914 +0000 UTC m=+0.163304764 container start e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:36:49 compute-0 sudo[196318]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: sudo: unable to send audit message: Operation not permitted
Feb 26 20:36:49 compute-0 podman[196295]: ceilometer_agent_compute
Feb 26 20:36:49 compute-0 sudo[196318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 26 20:36:49 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Feb 26 20:36:49 compute-0 sudo[196226]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Validating config file
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Copying service configuration files
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: INFO:__main__:Writing out command to execute
Feb 26 20:36:49 compute-0 sudo[196318]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: ++ cat /run_command
Feb 26 20:36:49 compute-0 podman[196319]: 2026-02-26 20:36:49.83530757 +0000 UTC m=+0.056319999 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + ARGS=
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + sudo kolla_copy_cacerts
Feb 26 20:36:49 compute-0 systemd[1]: e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd-73b6ba56e668ace0.service: Main process exited, code=exited, status=1/FAILURE
Feb 26 20:36:49 compute-0 systemd[1]: e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd-73b6ba56e668ace0.service: Failed with result 'exit-code'.
Feb 26 20:36:49 compute-0 sudo[196348]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: sudo: unable to send audit message: Operation not permitted
Feb 26 20:36:49 compute-0 sudo[196348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 26 20:36:49 compute-0 sudo[196348]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + [[ ! -n '' ]]
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + . kolla_extend_start
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + umask 0022
Feb 26 20:36:49 compute-0 ceilometer_agent_compute[196312]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Feb 26 20:36:50 compute-0 python3.9[196491]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.657 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:45
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.657 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.657 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.657 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.658 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 WARNING oslo_config.cfg [-] Deprecated: Option "tenant_name_discovery" from group "DEFAULT" is deprecated. Use option "identity_name_discovery" from group "DEFAULT".
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.659 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.660 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.661 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.662 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.663 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.664 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.665 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.666 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.667 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.668 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.669 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.670 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.690 12 INFO ceilometer.polling.manager [-] Starting heartbeat child service. Listening on /var/lib/ceilometer/ceilometer-compute.socket
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.690 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:53
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.690 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.691 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.692 12 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.693 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.694 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.695 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.696 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.697 12 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.698 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.701 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.701 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.701 12 DEBUG cotyledon._service [-] Run service AgentHeartBeatManager(0) [12] wait_forever /usr/lib/python3.12/site-packages/cotyledon/_service.py:263
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.702 12 DEBUG ceilometer.polling.manager [-] Started heartbeat child process. run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:519
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.703 12 DEBUG ceilometer.polling.manager [-] Started heartbeat update thread _read_queue /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:522
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.704 12 DEBUG ceilometer.polling.manager [-] Started heartbeat reporting thread _report_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:527
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.895 14 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/utils.py:96
Feb 26 20:36:50 compute-0 rsyslogd[1016]: imjournal from <np0005631999:ceilometer_agent_compute>: begin to drop messages due to rate-limiting
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.902 14 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.903 14 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Feb 26 20:36:50 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:50.903 14 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:53
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.036 14 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.037 14 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.038 14 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.039 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.040 14 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.041 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.042 14 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.043 14 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.044 14 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.044 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.044 14 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.044 14 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.045 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.046 14 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.047 14 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.048 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.049 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.050 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.051 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.052 14 DEBUG cotyledon._service [-] Run service AgentManager(0) [14] wait_forever /usr/lib/python3.12/site-packages/cotyledon/_service.py:263
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.054 14 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.12/site-packages/ceilometer/agent.py:64
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.068 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.069 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.069 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.070 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.070 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.071 14 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/utils.py:96
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.081 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.081 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:36:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:36:51 compute-0 sudo[196654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxvaftkzqehcisbxwcxrjvuvqhqcwqtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138210.9798143-558-54904003757419/AnsiballZ_stat.py'
Feb 26 20:36:51 compute-0 sudo[196654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:51 compute-0 python3.9[196657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:51 compute-0 sudo[196654]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:51 compute-0 sudo[196780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rflwkfbkcpabrnrmluhanjmpguzdcsyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138210.9798143-558-54904003757419/AnsiballZ_copy.py'
Feb 26 20:36:51 compute-0 sudo[196780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:51 compute-0 python3.9[196783]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138210.9798143-558-54904003757419/.source.yaml _original_basename=.zvlhzst4 follow=False checksum=c46e6f62c96dbf5f39b5046b64f97534d12d7df5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:51 compute-0 sudo[196780]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:52 compute-0 sudo[196933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bunvlwtvdheorgiwoljlmhwpxrxaxefb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138212.1505322-573-115399570038384/AnsiballZ_stat.py'
Feb 26 20:36:52 compute-0 sudo[196933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:52 compute-0 python3.9[196936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:52 compute-0 sudo[196933]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:52 compute-0 sudo[197057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpcqnjbihwgqkjiyizamupoegmwqusou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138212.1505322-573-115399570038384/AnsiballZ_copy.py'
Feb 26 20:36:52 compute-0 sudo[197057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:53 compute-0 python3.9[197060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138212.1505322-573-115399570038384/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:53 compute-0 sudo[197057]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:53 compute-0 sudo[197210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ichadwejihgriaqxfwvcilzjeccbkhgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138213.613569-594-154632493595416/AnsiballZ_file.py'
Feb 26 20:36:53 compute-0 sudo[197210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:54 compute-0 python3.9[197213]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:54 compute-0 sudo[197210]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:54 compute-0 sudo[197363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okqrcqkaypyyonsmoqjfvhprqjncbvrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138214.2838004-602-183076250257905/AnsiballZ_file.py'
Feb 26 20:36:54 compute-0 sudo[197363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:54 compute-0 python3.9[197366]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:36:54 compute-0 sudo[197363]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:55 compute-0 sudo[197516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cndwfuxsrthsuwtbqkapistbypgkecze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138214.9556963-610-236441486050197/AnsiballZ_stat.py'
Feb 26 20:36:55 compute-0 sudo[197516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:55 compute-0 python3.9[197519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:36:55 compute-0 sudo[197516]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:55 compute-0 sudo[197595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrrtnkjoemmphnxtbbsgnffjdvnpihye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138214.9556963-610-236441486050197/AnsiballZ_file.py'
Feb 26 20:36:55 compute-0 sudo[197595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:55 compute-0 python3.9[197598]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.a6xqzjj5 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:56 compute-0 sudo[197595]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:56 compute-0 python3.9[197748]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:36:57 compute-0 podman[197993]: 2026-02-26 20:36:57.575701935 +0000 UTC m=+0.112284747 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 26 20:36:58 compute-0 sudo[198195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iycdshipgviwdjtsrvaprgzbpiwcnwvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138217.9631178-647-167927979320875/AnsiballZ_container_config_data.py'
Feb 26 20:36:58 compute-0 sudo[198195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:58 compute-0 python3.9[198198]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Feb 26 20:36:58 compute-0 sudo[198195]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:59 compute-0 sudo[198348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unaktpxyvbshlwleetervxnjmzepfdxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138218.7366939-658-247562296918680/AnsiballZ_container_config_hash.py'
Feb 26 20:36:59 compute-0 sudo[198348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.063 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.063 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.063 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.064 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.080 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.080 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.080 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.081 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.081 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.081 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.082 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.082 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.082 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.105 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.105 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.105 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.105 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:36:59 compute-0 python3.9[198351]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.265 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.267 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5938MB free_disk=72.968505859375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.267 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.268 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:36:59 compute-0 sudo[198348]: pam_unix(sudo:session): session closed for user root
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.322 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.322 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.342 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.366 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.367 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:36:59 compute-0 nova_compute[186588]: 2026-02-26 20:36:59.368 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:36:59 compute-0 sudo[198501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmedwkxxhsjfxhfboukyygqkcbgpmgwj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138219.5380404-668-35132844821549/AnsiballZ_edpm_container_manage.py'
Feb 26 20:36:59 compute-0 sudo[198501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:36:59 compute-0 python3[198504]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:37:00 compute-0 podman[198541]: 2026-02-26 20:37:00.137260064 +0000 UTC m=+0.049145195 container create 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:37:00 compute-0 podman[198541]: 2026-02-26 20:37:00.116380767 +0000 UTC m=+0.028265918 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 26 20:37:00 compute-0 python3[198504]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /:/rootfs:ro --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl --path.rootfs=/rootfs
Feb 26 20:37:00 compute-0 sudo[198501]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:00 compute-0 sudo[198729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jonbkfondgoiozffbcqbjtumhmzgpyty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138220.3757524-676-252573896588458/AnsiballZ_stat.py'
Feb 26 20:37:00 compute-0 sudo[198729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:00 compute-0 python3.9[198732]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:37:00 compute-0 sudo[198729]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:01 compute-0 sudo[198884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zusxegqbrjcglllwscbrkrlrpjykckpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138220.9747806-685-146556231706834/AnsiballZ_file.py'
Feb 26 20:37:01 compute-0 sudo[198884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:01 compute-0 python3.9[198887]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:01 compute-0 sudo[198884]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:01 compute-0 sudo[198961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocyfagsevokcjbdxitggfacfpcnhapeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138220.9747806-685-146556231706834/AnsiballZ_stat.py'
Feb 26 20:37:01 compute-0 sudo[198961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:01 compute-0 python3.9[198964]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:37:01 compute-0 sudo[198961]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:02 compute-0 sudo[199113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-newzfciimxplznenbpuxhgpwmokabces ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138221.8959756-685-40344371930323/AnsiballZ_copy.py'
Feb 26 20:37:02 compute-0 sudo[199113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:02 compute-0 python3.9[199116]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772138221.8959756-685-40344371930323/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:02 compute-0 sudo[199113]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:02 compute-0 sudo[199190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siejgvwmyciqhdydynjgxpyjpbmhiurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138221.8959756-685-40344371930323/AnsiballZ_systemd.py'
Feb 26 20:37:02 compute-0 sudo[199190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:03 compute-0 python3.9[199193]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:37:03 compute-0 systemd[1]: Reloading.
Feb 26 20:37:03 compute-0 systemd-sysv-generator[199225]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:37:03 compute-0 systemd-rc-local-generator[199221]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:37:03 compute-0 sudo[199190]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:03 compute-0 sudo[199309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xykygzrcrnggzslalogjnqmbmsyjwgyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138221.8959756-685-40344371930323/AnsiballZ_systemd.py'
Feb 26 20:37:03 compute-0 sudo[199309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:03 compute-0 python3.9[199312]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:37:04 compute-0 systemd[1]: Reloading.
Feb 26 20:37:04 compute-0 systemd-sysv-generator[199346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:37:04 compute-0 systemd-rc-local-generator[199341]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:37:04 compute-0 systemd[1]: Starting node_exporter container...
Feb 26 20:37:04 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:37:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8de50acea4ce4d38272a54b70eb980238956c73b647c2fc922360ef4de4aae00/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8de50acea4ce4d38272a54b70eb980238956c73b647c2fc922360ef4de4aae00/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:04 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33.
Feb 26 20:37:04 compute-0 podman[199359]: 2026-02-26 20:37:04.467693017 +0000 UTC m=+0.153268118 container init 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.492Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.492Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.492Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.493Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.493Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.493Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.493Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=arp
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=bcache
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=bonding
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=cpu
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=edac
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=filefd
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=netclass
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=netdev
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=netstat
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=nfs
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=nvme
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=softnet
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=systemd
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.494Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.495Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.495Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.495Z caller=node_exporter.go:117 level=info collector=xfs
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.495Z caller=node_exporter.go:117 level=info collector=zfs
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.496Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Feb 26 20:37:04 compute-0 node_exporter[199374]: ts=2026-02-26T20:37:04.496Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Feb 26 20:37:04 compute-0 podman[199359]: 2026-02-26 20:37:04.507035 +0000 UTC m=+0.192610051 container start 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:37:04 compute-0 podman[199359]: node_exporter
Feb 26 20:37:04 compute-0 systemd[1]: Started node_exporter container.
Feb 26 20:37:04 compute-0 sudo[199309]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:04 compute-0 podman[199383]: 2026-02-26 20:37:04.598422769 +0000 UTC m=+0.080043096 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:37:05 compute-0 python3.9[199557]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:37:06 compute-0 sudo[199707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opffifpkthxufucaokzdubkyhkhdhlzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138225.8160422-730-202983473794312/AnsiballZ_stat.py'
Feb 26 20:37:06 compute-0 sudo[199707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:06 compute-0 python3.9[199710]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:06 compute-0 sudo[199707]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:06 compute-0 sudo[199833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uftivgfqvarexbezqrfrfhtyrzwcinxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138225.8160422-730-202983473794312/AnsiballZ_copy.py'
Feb 26 20:37:06 compute-0 sudo[199833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:06 compute-0 python3.9[199836]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138225.8160422-730-202983473794312/.source.yaml _original_basename=.iy5ibw3n follow=False checksum=6660b211a3b61007d9b24bffcd528683039ab38e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:06 compute-0 sudo[199833]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:07 compute-0 sudo[199986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkllvzvbbfisakowhomkgmmmqutmkkpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138227.0939684-745-236227324390855/AnsiballZ_stat.py'
Feb 26 20:37:07 compute-0 sudo[199986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:07 compute-0 python3.9[199989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:07 compute-0 sudo[199986]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:07 compute-0 sudo[200110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieyszvbrjoibognnselzmfkknoegrlsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138227.0939684-745-236227324390855/AnsiballZ_copy.py'
Feb 26 20:37:07 compute-0 sudo[200110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:08 compute-0 python3.9[200113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138227.0939684-745-236227324390855/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:37:08 compute-0 sudo[200110]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:08 compute-0 sudo[200263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvgxlcilyzmxawfohoxxrkxiojandcvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138228.7112386-766-249022143882018/AnsiballZ_file.py'
Feb 26 20:37:08 compute-0 sudo[200263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:09 compute-0 python3.9[200266]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:09 compute-0 sudo[200263]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:09 compute-0 sudo[200416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmidlblcfbduinwlnvghejowcpxkvvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138229.39326-774-120646342926377/AnsiballZ_file.py'
Feb 26 20:37:09 compute-0 sudo[200416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:09 compute-0 python3.9[200419]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:37:09 compute-0 sudo[200416]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:10 compute-0 sudo[200569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihqrdpznkvujlpajhlpevsdcjnylzeem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138230.1025784-782-7272994819357/AnsiballZ_stat.py'
Feb 26 20:37:10 compute-0 sudo[200569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:10 compute-0 python3.9[200572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:10 compute-0 sudo[200569]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:10 compute-0 sudo[200648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khojaouzivwafbddjampkaehkbeukqkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138230.1025784-782-7272994819357/AnsiballZ_file.py'
Feb 26 20:37:10 compute-0 sudo[200648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:11 compute-0 python3.9[200651]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.88s111eo recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:11 compute-0 sudo[200648]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:11 compute-0 python3.9[200801]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:13 compute-0 sudo[201222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyfijlsoikpoqsfdwjmpbuqetvbmfugh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138233.2235117-819-111325008229353/AnsiballZ_container_config_data.py'
Feb 26 20:37:13 compute-0 sudo[201222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:13 compute-0 python3.9[201225]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 26 20:37:13 compute-0 sudo[201222]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:14 compute-0 sudo[201375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubodnyunnnspbimuqlbflxvskpzhwwcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138233.9549947-830-149981047675976/AnsiballZ_container_config_hash.py'
Feb 26 20:37:14 compute-0 sudo[201375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:14 compute-0 python3.9[201378]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:37:14 compute-0 sudo[201375]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:14 compute-0 sudo[201528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpfcbyxyxsoothrbaazsfgdapkquehbk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138234.7506084-840-51731250638073/AnsiballZ_edpm_container_manage.py'
Feb 26 20:37:14 compute-0 sudo[201528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:15 compute-0 python3[201531]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:37:16 compute-0 podman[201544]: 2026-02-26 20:37:16.456346549 +0000 UTC m=+1.229816365 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 26 20:37:16 compute-0 podman[201642]: 2026-02-26 20:37:16.619364914 +0000 UTC m=+0.062190787 container create 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible)
Feb 26 20:37:16 compute-0 podman[201642]: 2026-02-26 20:37:16.589699367 +0000 UTC m=+0.032525290 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 26 20:37:16 compute-0 python3[201531]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 26 20:37:16 compute-0 sudo[201528]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:17 compute-0 sudo[201830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgnbtfpbmnxwiygivwaseujphdtgawyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138236.9290946-848-111057706207825/AnsiballZ_stat.py'
Feb 26 20:37:17 compute-0 sudo[201830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:17 compute-0 python3.9[201833]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:37:17 compute-0 sudo[201830]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:17 compute-0 sudo[201985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmeasfvzlamkoddjjyzqbnqlucrkovtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138237.5602276-857-78811038478658/AnsiballZ_file.py'
Feb 26 20:37:17 compute-0 sudo[201985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:17 compute-0 python3.9[201988]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:18 compute-0 sudo[201985]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:18 compute-0 sudo[202062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psuoeyuyshmxtcldwkqocedkjzvsqmxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138237.5602276-857-78811038478658/AnsiballZ_stat.py'
Feb 26 20:37:18 compute-0 sudo[202062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:18 compute-0 python3.9[202065]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:37:18 compute-0 sudo[202062]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:18 compute-0 sudo[202214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icrvoekfpbgqflrcsyvdvcozcogonjan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138238.4848015-857-276894661104965/AnsiballZ_copy.py'
Feb 26 20:37:18 compute-0 sudo[202214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:19 compute-0 python3.9[202217]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772138238.4848015-857-276894661104965/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:19 compute-0 sudo[202214]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:19 compute-0 sudo[202291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tokdzsflsytwqobblnfnzitrlrvavkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138238.4848015-857-276894661104965/AnsiballZ_systemd.py'
Feb 26 20:37:19 compute-0 sudo[202291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:19 compute-0 python3.9[202294]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:37:19 compute-0 systemd[1]: Reloading.
Feb 26 20:37:19 compute-0 systemd-rc-local-generator[202332]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:37:19 compute-0 systemd-sysv-generator[202338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:37:19 compute-0 podman[202296]: 2026-02-26 20:37:19.729835805 +0000 UTC m=+0.100471530 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260223, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 26 20:37:19 compute-0 sudo[202291]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:19 compute-0 podman[202356]: 2026-02-26 20:37:19.995660364 +0000 UTC m=+0.061214258 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 26 20:37:20 compute-0 systemd[1]: e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd-73b6ba56e668ace0.service: Main process exited, code=exited, status=1/FAILURE
Feb 26 20:37:20 compute-0 systemd[1]: e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd-73b6ba56e668ace0.service: Failed with result 'exit-code'.
Feb 26 20:37:20 compute-0 sudo[202449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geyhbskoaogaiazpzcisbnlmbeestatn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138238.4848015-857-276894661104965/AnsiballZ_systemd.py'
Feb 26 20:37:20 compute-0 sudo[202449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:20 compute-0 python3.9[202452]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:37:20 compute-0 systemd[1]: Reloading.
Feb 26 20:37:20 compute-0 systemd-rc-local-generator[202482]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:37:20 compute-0 systemd-sysv-generator[202485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:37:20 compute-0 systemd[1]: Starting podman_exporter container...
Feb 26 20:37:20 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:37:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bca865956f485b2e84d694d4b08a5c0e155bb80a36e1b283b7ea4896dd92724f/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bca865956f485b2e84d694d4b08a5c0e155bb80a36e1b283b7ea4896dd92724f/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:21 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157.
Feb 26 20:37:21 compute-0 podman[202500]: 2026-02-26 20:37:21.020338471 +0000 UTC m=+0.128638264 container init 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.037Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.037Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.037Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.037Z caller=handler.go:105 level=info collector=container
Feb 26 20:37:21 compute-0 podman[202500]: 2026-02-26 20:37:21.04164574 +0000 UTC m=+0.149945533 container start 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 26 20:37:21 compute-0 podman[202500]: podman_exporter
Feb 26 20:37:21 compute-0 systemd[1]: Starting Podman API Service...
Feb 26 20:37:21 compute-0 systemd[1]: Started Podman API Service.
Feb 26 20:37:21 compute-0 systemd[1]: Started podman_exporter container.
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="Setting parallel job count to 25"
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="Using sqlite as database backend"
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 26 20:37:21 compute-0 podman[202527]: @ - - [26/Feb/2026:20:37:21 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 26 20:37:21 compute-0 sudo[202449]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:21 compute-0 podman[202527]: time="2026-02-26T20:37:21Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:37:21 compute-0 podman[202527]: @ - - [26/Feb/2026:20:37:21 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18632 "" "Go-http-client/1.1"
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.111Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.112Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 26 20:37:21 compute-0 podman_exporter[202516]: ts=2026-02-26T20:37:21.112Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 26 20:37:21 compute-0 podman[202525]: 2026-02-26 20:37:21.113630956 +0000 UTC m=+0.063105754 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:37:21 compute-0 systemd[1]: 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157-1a605536cc1b2e60.service: Main process exited, code=exited, status=1/FAILURE
Feb 26 20:37:21 compute-0 systemd[1]: 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157-1a605536cc1b2e60.service: Failed with result 'exit-code'.
Feb 26 20:37:21 compute-0 python3.9[202711]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:37:22 compute-0 sudo[202861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdwixdgqkvckrkqunvddoegndcniptob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138242.055902-902-114070558891225/AnsiballZ_stat.py'
Feb 26 20:37:22 compute-0 sudo[202861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:22 compute-0 python3.9[202864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:22 compute-0 sudo[202861]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:22 compute-0 sudo[202987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inhekecpiaccbhswodvcfcpahyidpijf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138242.055902-902-114070558891225/AnsiballZ_copy.py'
Feb 26 20:37:22 compute-0 sudo[202987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:22 compute-0 python3.9[202990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138242.055902-902-114070558891225/.source.yaml _original_basename=.ssfigtok follow=False checksum=de6e1012da0b1baa7630460c4bf3fef4b6804d41 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:23 compute-0 sudo[202987]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:23 compute-0 sudo[203140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cruzvobyczojvzhcpyicozxshtxefhmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138243.1560802-917-231088889953654/AnsiballZ_stat.py'
Feb 26 20:37:23 compute-0 sudo[203140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:23 compute-0 python3.9[203143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:23 compute-0 sudo[203140]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:23 compute-0 sudo[203264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nixgiwuvhnhgszbftpseefvjbtqheeoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138243.1560802-917-231088889953654/AnsiballZ_copy.py'
Feb 26 20:37:23 compute-0 sudo[203264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:24 compute-0 python3.9[203267]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772138243.1560802-917-231088889953654/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:37:24 compute-0 sudo[203264]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:24 compute-0 sudo[203417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evnedbhrvjromokexcjuosajbaduwabs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138244.5920293-938-236082249747861/AnsiballZ_file.py'
Feb 26 20:37:24 compute-0 sudo[203417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:24 compute-0 python3.9[203420]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:25 compute-0 sudo[203417]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:25 compute-0 sudo[203570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtprhyedzcgbhumkbuweclikxfgjlkhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138245.142625-946-234066158761605/AnsiballZ_file.py'
Feb 26 20:37:25 compute-0 sudo[203570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:25 compute-0 python3.9[203573]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 26 20:37:25 compute-0 sudo[203570]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:25 compute-0 sudo[203723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psboieatggejjalbuzrdlbtqslljzudx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138245.6929915-954-49846163584400/AnsiballZ_stat.py'
Feb 26 20:37:25 compute-0 sudo[203723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:26 compute-0 python3.9[203726]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:26 compute-0 sudo[203723]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:26 compute-0 sudo[203802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asbyfgboibadywpmnexuczohpudmefss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138245.6929915-954-49846163584400/AnsiballZ_file.py'
Feb 26 20:37:26 compute-0 sudo[203802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:26 compute-0 python3.9[203805]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.svqgzw01 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:26 compute-0 sudo[203802]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:27 compute-0 python3.9[203955]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:27 compute-0 podman[204079]: 2026-02-26 20:37:27.747801026 +0000 UTC m=+0.093802980 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 26 20:37:28 compute-0 sudo[204402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbyvukowhznlzzyxadydrluhvvkxjqwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138248.617353-991-76670858680667/AnsiballZ_container_config_data.py'
Feb 26 20:37:28 compute-0 sudo[204402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:29 compute-0 python3.9[204405]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 26 20:37:29 compute-0 sudo[204402]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:29 compute-0 sudo[204555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyqtntjiwtcerowfmudylymjpoizrpyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138249.3827126-1002-189693194242803/AnsiballZ_container_config_hash.py'
Feb 26 20:37:29 compute-0 sudo[204555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:29 compute-0 python3.9[204558]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 26 20:37:29 compute-0 sudo[204555]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:30 compute-0 auditd[719]: Audit daemon rotating log files
Feb 26 20:37:30 compute-0 sudo[204708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lczqnyvxvmjxspauwdofpzghshpyectf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138250.17009-1012-151207842422897/AnsiballZ_edpm_container_manage.py'
Feb 26 20:37:30 compute-0 sudo[204708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:30 compute-0 python3[204711]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 26 20:37:33 compute-0 podman[204725]: 2026-02-26 20:37:33.879578523 +0000 UTC m=+3.159398280 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 26 20:37:33 compute-0 podman[204823]: 2026-02-26 20:37:33.996680735 +0000 UTC m=+0.040205718 container create ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, release=1770267347, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 26 20:37:33 compute-0 podman[204823]: 2026-02-26 20:37:33.974164702 +0000 UTC m=+0.017689705 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 26 20:37:34 compute-0 python3[204711]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 26 20:37:34 compute-0 sudo[204708]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:34 compute-0 sudo[205011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aukybdigymrgmctaegxlghipfypdxayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138254.234945-1020-14515399609468/AnsiballZ_stat.py'
Feb 26 20:37:34 compute-0 sudo[205011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:34 compute-0 python3.9[205014]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:37:34 compute-0 sudo[205011]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:34 compute-0 podman[205017]: 2026-02-26 20:37:34.801889556 +0000 UTC m=+0.084059591 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:37:35 compute-0 sudo[205191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxoibssdpkkvsvotkqaygtdvetzsvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138254.9279013-1029-183763728239477/AnsiballZ_file.py'
Feb 26 20:37:35 compute-0 sudo[205191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:35 compute-0 python3.9[205194]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:35 compute-0 sudo[205191]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:35 compute-0 sudo[205268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hipsgbtbhbpbbrneoopiqraxlpccztai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138254.9279013-1029-183763728239477/AnsiballZ_stat.py'
Feb 26 20:37:35 compute-0 sudo[205268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:35 compute-0 python3.9[205271]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:37:35 compute-0 sudo[205268]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:36 compute-0 sudo[205420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdnvgyyynxhzjigbzsfzzmfclzecghgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138256.0322897-1029-266043571506491/AnsiballZ_copy.py'
Feb 26 20:37:36 compute-0 sudo[205420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:36 compute-0 python3.9[205423]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772138256.0322897-1029-266043571506491/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:36 compute-0 sudo[205420]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:36 compute-0 sudo[205497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osojhovtlslmbgevudemabfkxwmuwhzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138256.0322897-1029-266043571506491/AnsiballZ_systemd.py'
Feb 26 20:37:36 compute-0 sudo[205497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:37 compute-0 python3.9[205500]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 26 20:37:37 compute-0 systemd[1]: Reloading.
Feb 26 20:37:37 compute-0 systemd-rc-local-generator[205526]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:37:37 compute-0 systemd-sysv-generator[205530]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:37:37 compute-0 sudo[205497]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:37 compute-0 sudo[205616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfocdcinxbdemrrevgqqczfvbtuehhfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138256.0322897-1029-266043571506491/AnsiballZ_systemd.py'
Feb 26 20:37:37 compute-0 sudo[205616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:38 compute-0 python3.9[205619]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 26 20:37:38 compute-0 systemd[1]: Reloading.
Feb 26 20:37:38 compute-0 systemd-rc-local-generator[205656]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 26 20:37:38 compute-0 systemd-sysv-generator[205660]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 26 20:37:38 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 26 20:37:38 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:37:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7e10ec932f390b90aedab9c636d9e5a41b22c9fc56271f4ad956545bfbbe2f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7e10ec932f390b90aedab9c636d9e5a41b22c9fc56271f4ad956545bfbbe2f/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7e10ec932f390b90aedab9c636d9e5a41b22c9fc56271f4ad956545bfbbe2f/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 26 20:37:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0.
Feb 26 20:37:38 compute-0 podman[205666]: 2026-02-26 20:37:38.619634162 +0000 UTC m=+0.135534082 container init ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, managed_by=edpm_ansible, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *bridge.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *coverage.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *datapath.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *iface.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *memory.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *ovn.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *pmd_perf.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *pmd_rxq.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: INFO    20:37:38 main.go:48: registering *vswitch.Collector
Feb 26 20:37:38 compute-0 openstack_network_exporter[205682]: NOTICE  20:37:38 main.go:76: listening on https://:9105/metrics
Feb 26 20:37:38 compute-0 podman[205666]: 2026-02-26 20:37:38.647037874 +0000 UTC m=+0.162937774 container start ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7)
Feb 26 20:37:38 compute-0 podman[205666]: openstack_network_exporter
Feb 26 20:37:38 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 26 20:37:38 compute-0 sudo[205616]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:38 compute-0 podman[205687]: 2026-02-26 20:37:38.720859272 +0000 UTC m=+0.060823388 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, vcs-type=git, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1770267347, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 26 20:37:39 compute-0 python3.9[205864]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 26 20:37:40 compute-0 sudo[206014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqbrkpapfnwogoavilfmrbnguoctgqym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138259.8246393-1074-20231522758096/AnsiballZ_stat.py'
Feb 26 20:37:40 compute-0 sudo[206014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:40 compute-0 python3.9[206017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:40 compute-0 sudo[206014]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:40 compute-0 sudo[206140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvidqruceuxrfuabqpqzlkvijwrfgbca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138259.8246393-1074-20231522758096/AnsiballZ_copy.py'
Feb 26 20:37:40 compute-0 sudo[206140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:40 compute-0 python3.9[206143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138259.8246393-1074-20231522758096/.source.yaml _original_basename=.dlu5z27z follow=False checksum=e86787c9a065080d09cebf5c85a8f4691204da1f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:40 compute-0 sudo[206140]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:41 compute-0 sudo[206293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdzkgrccsulkcgfaitsimyuyaauqqjxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138261.0599124-1089-262204718246436/AnsiballZ_find.py'
Feb 26 20:37:41 compute-0 sudo[206293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:41 compute-0 python3.9[206296]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 26 20:37:41 compute-0 sudo[206293]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:42 compute-0 sudo[206446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkfebnkucsydccpluekxdtnlvwcelzhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138261.8819394-1099-153684732507529/AnsiballZ_podman_container_info.py'
Feb 26 20:37:42 compute-0 sudo[206446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:42 compute-0 python3.9[206449]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 26 20:37:42 compute-0 sudo[206446]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:43 compute-0 sudo[206612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnbnlaqimkjivtjfgywgtbdmkjeccuap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138262.6827173-1107-134662043519026/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:43 compute-0 sudo[206612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:43 compute-0 python3.9[206615]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:43 compute-0 systemd[1]: Started libpod-conmon-c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25.scope.
Feb 26 20:37:43 compute-0 podman[206616]: 2026-02-26 20:37:43.36326148 +0000 UTC m=+0.077974976 container exec c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 26 20:37:43 compute-0 podman[206616]: 2026-02-26 20:37:43.398441602 +0000 UTC m=+0.113155058 container exec_died c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Feb 26 20:37:43 compute-0 systemd[1]: libpod-conmon-c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25.scope: Deactivated successfully.
Feb 26 20:37:43 compute-0 sudo[206612]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:43 compute-0 sudo[206798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibbxwpwesxmyqkhkvkydfarkxgjdewdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138263.6134477-1115-165198695878963/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:43 compute-0 sudo[206798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:44 compute-0 python3.9[206801]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:44 compute-0 systemd[1]: Started libpod-conmon-c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25.scope.
Feb 26 20:37:44 compute-0 podman[206802]: 2026-02-26 20:37:44.137197224 +0000 UTC m=+0.074077948 container exec c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:37:44 compute-0 podman[206802]: 2026-02-26 20:37:44.171244675 +0000 UTC m=+0.108125359 container exec_died c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 26 20:37:44 compute-0 systemd[1]: libpod-conmon-c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25.scope: Deactivated successfully.
Feb 26 20:37:44 compute-0 sudo[206798]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:44 compute-0 sudo[206983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dktcsasyovfviaayfmdnddzhxsblymis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138264.3471825-1123-201546707537843/AnsiballZ_file.py'
Feb 26 20:37:44 compute-0 sudo[206983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:44 compute-0 python3.9[206986]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:44 compute-0 sudo[206983]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:45 compute-0 sudo[207136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vekozedbipvpuwpgzlfwpczkdybfqdvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138265.0406063-1132-258868693817070/AnsiballZ_podman_container_info.py'
Feb 26 20:37:45 compute-0 sudo[207136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:45 compute-0 python3.9[207139]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 26 20:37:45 compute-0 sudo[207136]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:45 compute-0 sudo[207303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bksciraephycdqpjrwjrtafvpmnojzvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138265.7532551-1140-175116763002963/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:45 compute-0 sudo[207303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:46 compute-0 python3.9[207306]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:46 compute-0 systemd[1]: Started libpod-conmon-a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0.scope.
Feb 26 20:37:46 compute-0 podman[207307]: 2026-02-26 20:37:46.186082261 +0000 UTC m=+0.062315734 container exec a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 26 20:37:46 compute-0 podman[207326]: 2026-02-26 20:37:46.243017395 +0000 UTC m=+0.048411729 container exec_died a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:37:46 compute-0 podman[207307]: 2026-02-26 20:37:46.247233041 +0000 UTC m=+0.123466504 container exec_died a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 26 20:37:46 compute-0 systemd[1]: libpod-conmon-a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0.scope: Deactivated successfully.
Feb 26 20:37:46 compute-0 sudo[207303]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:37:46.499 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:37:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:37:46.501 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:37:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:37:46.501 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:37:46 compute-0 sudo[207488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlftiyqanvvpugyvukotwfjfgfbxyzvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138266.3908727-1148-145034471371994/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:46 compute-0 sudo[207488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:46 compute-0 python3.9[207491]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:46 compute-0 systemd[1]: Started libpod-conmon-a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0.scope.
Feb 26 20:37:46 compute-0 podman[207492]: 2026-02-26 20:37:46.84120942 +0000 UTC m=+0.057459939 container exec a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260223, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 26 20:37:46 compute-0 podman[207511]: 2026-02-26 20:37:46.904049058 +0000 UTC m=+0.049669885 container exec_died a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 26 20:37:46 compute-0 podman[207492]: 2026-02-26 20:37:46.908428308 +0000 UTC m=+0.124678807 container exec_died a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 26 20:37:46 compute-0 systemd[1]: libpod-conmon-a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0.scope: Deactivated successfully.
Feb 26 20:37:46 compute-0 sudo[207488]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:47 compute-0 sudo[207674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpngshlplllshbptocssjwybxgiycbpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138267.077518-1156-220045381077706/AnsiballZ_file.py'
Feb 26 20:37:47 compute-0 sudo[207674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:47 compute-0 python3.9[207677]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:47 compute-0 sudo[207674]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:47 compute-0 sudo[207827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkbsdusbygvrsxnpbvnafvyuubtrtgww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138267.6145039-1165-245506912043243/AnsiballZ_podman_container_info.py'
Feb 26 20:37:47 compute-0 sudo[207827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:48 compute-0 python3.9[207830]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Feb 26 20:37:48 compute-0 sudo[207827]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:48 compute-0 sudo[207991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnjtwdtqkoqvltegssvbxmdyqrptcret ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138268.2916713-1173-134182070087659/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:48 compute-0 sudo[207991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:48 compute-0 python3.9[207994]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:48 compute-0 systemd[1]: Started libpod-conmon-e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd.scope.
Feb 26 20:37:48 compute-0 podman[207995]: 2026-02-26 20:37:48.783034498 +0000 UTC m=+0.079099688 container exec e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4)
Feb 26 20:37:48 compute-0 podman[207995]: 2026-02-26 20:37:48.817399638 +0000 UTC m=+0.113464848 container exec_died e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260223, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0)
Feb 26 20:37:48 compute-0 systemd[1]: libpod-conmon-e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd.scope: Deactivated successfully.
Feb 26 20:37:48 compute-0 sudo[207991]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:49 compute-0 sudo[208177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqekdvuydhqkfkdxslnwuhwpaptnwpjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138269.0145051-1181-14200224195075/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:49 compute-0 sudo[208177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:49 compute-0 python3.9[208180]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:49 compute-0 systemd[1]: Started libpod-conmon-e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd.scope.
Feb 26 20:37:49 compute-0 podman[208181]: 2026-02-26 20:37:49.585946692 +0000 UTC m=+0.072207037 container exec e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 26 20:37:49 compute-0 podman[208181]: 2026-02-26 20:37:49.615960952 +0000 UTC m=+0.102221317 container exec_died e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 26 20:37:49 compute-0 sudo[208177]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:49 compute-0 systemd[1]: libpod-conmon-e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd.scope: Deactivated successfully.
Feb 26 20:37:50 compute-0 sudo[208387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqsoygadtvzbgboxlhzzqdlioggrxagb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138269.8626757-1189-251251457701483/AnsiballZ_file.py'
Feb 26 20:37:50 compute-0 sudo[208387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:50 compute-0 podman[208336]: 2026-02-26 20:37:50.156037432 +0000 UTC m=+0.067464206 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 26 20:37:50 compute-0 podman[208337]: 2026-02-26 20:37:50.159904288 +0000 UTC m=+0.065048909 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 26 20:37:50 compute-0 systemd[1]: e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd-73b6ba56e668ace0.service: Main process exited, code=exited, status=1/FAILURE
Feb 26 20:37:50 compute-0 systemd[1]: e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd-73b6ba56e668ace0.service: Failed with result 'exit-code'.
Feb 26 20:37:50 compute-0 python3.9[208399]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:50 compute-0 sudo[208387]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:50 compute-0 sudo[208551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-begmsgswixdmkqrbdryhznkcuuktfzdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138270.5050724-1198-9701240233208/AnsiballZ_podman_container_info.py'
Feb 26 20:37:50 compute-0 sudo[208551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:50 compute-0 python3.9[208554]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Feb 26 20:37:50 compute-0 sudo[208551]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:51 compute-0 sudo[208728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dglwcadapbzryoukkauyjlmpspydynwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138271.0679502-1206-151959612195587/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:51 compute-0 sudo[208728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:51 compute-0 podman[208691]: 2026-02-26 20:37:51.278379836 +0000 UTC m=+0.047027980 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:37:51 compute-0 python3.9[208734]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:51 compute-0 systemd[1]: Started libpod-conmon-8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33.scope.
Feb 26 20:37:51 compute-0 podman[208746]: 2026-02-26 20:37:51.516131938 +0000 UTC m=+0.082039808 container exec 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 26 20:37:51 compute-0 podman[208766]: 2026-02-26 20:37:51.574985705 +0000 UTC m=+0.050229519 container exec_died 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:37:51 compute-0 podman[208746]: 2026-02-26 20:37:51.580029145 +0000 UTC m=+0.145936995 container exec_died 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:37:51 compute-0 systemd[1]: libpod-conmon-8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33.scope: Deactivated successfully.
Feb 26 20:37:51 compute-0 sudo[208728]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:51 compute-0 sudo[208926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohzpguqgqirhkmvtgvlvrkbrspoxekor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138271.7414155-1214-243451970707805/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:51 compute-0 sudo[208926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:52 compute-0 python3.9[208929]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:52 compute-0 systemd[1]: Started libpod-conmon-8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33.scope.
Feb 26 20:37:52 compute-0 podman[208930]: 2026-02-26 20:37:52.204754384 +0000 UTC m=+0.067976240 container exec 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:37:52 compute-0 podman[208950]: 2026-02-26 20:37:52.262055517 +0000 UTC m=+0.049992092 container exec_died 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 26 20:37:52 compute-0 podman[208930]: 2026-02-26 20:37:52.266184552 +0000 UTC m=+0.129406378 container exec_died 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:37:52 compute-0 systemd[1]: libpod-conmon-8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33.scope: Deactivated successfully.
Feb 26 20:37:52 compute-0 sudo[208926]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:52 compute-0 sudo[209112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glzxcrpeeayruyjkjtsmiifziyvpiflw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138272.4229264-1222-14489020730717/AnsiballZ_file.py'
Feb 26 20:37:52 compute-0 sudo[209112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:52 compute-0 python3.9[209115]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:52 compute-0 sudo[209112]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:53 compute-0 sudo[209265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoucekfbeysciekmmkffvinmjriionkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138273.0353591-1231-110171475815463/AnsiballZ_podman_container_info.py'
Feb 26 20:37:53 compute-0 sudo[209265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:53 compute-0 python3.9[209268]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 26 20:37:53 compute-0 sudo[209265]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:53 compute-0 sudo[209431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyuxwixjmouadxegdbutkoyxyrckcpjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138273.650224-1239-38437220122082/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:53 compute-0 sudo[209431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:54 compute-0 python3.9[209434]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:54 compute-0 systemd[1]: Started libpod-conmon-2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157.scope.
Feb 26 20:37:54 compute-0 podman[209435]: 2026-02-26 20:37:54.160998879 +0000 UTC m=+0.082799739 container exec 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:37:54 compute-0 podman[209455]: 2026-02-26 20:37:54.223193349 +0000 UTC m=+0.054254451 container exec_died 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:37:54 compute-0 podman[209435]: 2026-02-26 20:37:54.230350847 +0000 UTC m=+0.152151697 container exec_died 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:37:54 compute-0 systemd[1]: libpod-conmon-2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157.scope: Deactivated successfully.
Feb 26 20:37:54 compute-0 sudo[209431]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:54 compute-0 sudo[209617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woylbjuwgwlycumqbedkdpszdkpztngp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138274.4355953-1247-233362258926743/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:54 compute-0 sudo[209617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:55 compute-0 python3.9[209620]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:55 compute-0 systemd[1]: Started libpod-conmon-2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157.scope.
Feb 26 20:37:55 compute-0 podman[209621]: 2026-02-26 20:37:55.121006967 +0000 UTC m=+0.077619597 container exec 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:37:55 compute-0 podman[209621]: 2026-02-26 20:37:55.155251713 +0000 UTC m=+0.111864253 container exec_died 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 26 20:37:55 compute-0 systemd[1]: libpod-conmon-2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157.scope: Deactivated successfully.
Feb 26 20:37:55 compute-0 sudo[209617]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:55 compute-0 sudo[209799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvltdkwqemhdcfoiswjxxkjqezmtjaft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138275.3682702-1255-122945725640596/AnsiballZ_file.py'
Feb 26 20:37:55 compute-0 sudo[209799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:55 compute-0 python3.9[209802]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:55 compute-0 sudo[209799]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:56 compute-0 sudo[209952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lorwvqkpwmkbpdhuwwzokumezuyyzzde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138276.019534-1264-14746170263031/AnsiballZ_podman_container_info.py'
Feb 26 20:37:56 compute-0 sudo[209952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:56 compute-0 python3.9[209955]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 26 20:37:56 compute-0 sudo[209952]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:56 compute-0 sudo[210118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsnuwillduedywdefdpdurykswcdlafw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138276.6470683-1272-230229984555923/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:56 compute-0 sudo[210118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:57 compute-0 python3.9[210121]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:57 compute-0 systemd[1]: Started libpod-conmon-ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0.scope.
Feb 26 20:37:57 compute-0 podman[210122]: 2026-02-26 20:37:57.139539266 +0000 UTC m=+0.061362998 container exec ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 26 20:37:57 compute-0 podman[210122]: 2026-02-26 20:37:57.169439781 +0000 UTC m=+0.091263463 container exec_died ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, version=9.7, release=1770267347)
Feb 26 20:37:57 compute-0 systemd[1]: libpod-conmon-ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0.scope: Deactivated successfully.
Feb 26 20:37:57 compute-0 sudo[210118]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:57 compute-0 sudo[210303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokngbjzljuheppgmtufyleslmylebtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138277.3823993-1280-171802527980528/AnsiballZ_podman_container_exec.py'
Feb 26 20:37:57 compute-0 sudo[210303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:57 compute-0 python3.9[210306]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 26 20:37:57 compute-0 systemd[1]: Started libpod-conmon-ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0.scope.
Feb 26 20:37:57 compute-0 podman[210307]: 2026-02-26 20:37:57.895102741 +0000 UTC m=+0.071454236 container exec ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 26 20:37:57 compute-0 podman[210307]: 2026-02-26 20:37:57.925891072 +0000 UTC m=+0.102242477 container exec_died ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Feb 26 20:37:57 compute-0 systemd[1]: libpod-conmon-ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0.scope: Deactivated successfully.
Feb 26 20:37:57 compute-0 sudo[210303]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:57 compute-0 podman[210323]: 2026-02-26 20:37:57.98078546 +0000 UTC m=+0.085124735 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:37:58 compute-0 sudo[210515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nupavnmpgkygahwmfdxuotlftlhzprlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138278.111185-1288-188965465580099/AnsiballZ_file.py'
Feb 26 20:37:58 compute-0 sudo[210515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:58 compute-0 python3.9[210518]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:58 compute-0 sudo[210515]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:58 compute-0 sudo[210668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eshjhfvitqlhplizzdgblwcttenerrjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138278.7077994-1297-128029204570994/AnsiballZ_file.py'
Feb 26 20:37:58 compute-0 sudo[210668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:59 compute-0 python3.9[210671]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:37:59 compute-0 sudo[210668]: pam_unix(sudo:session): session closed for user root
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.358 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.359 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.401 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.401 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.402 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.414 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.414 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:37:59 compute-0 nova_compute[186588]: 2026-02-26 20:37:59.415 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:37:59 compute-0 sudo[210821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdmasywgovedeudnzaibksbgcehsgbsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138279.3963299-1305-254165715230760/AnsiballZ_stat.py'
Feb 26 20:37:59 compute-0 sudo[210821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:37:59 compute-0 python3.9[210824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:37:59 compute-0 sudo[210821]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.084 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.084 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.084 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.084 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.216 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.217 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5802MB free_disk=72.74266815185547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.217 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.218 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:38:00 compute-0 sudo[210945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkmkotnhhqxtgkzobguarohyahvjuztf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138279.3963299-1305-254165715230760/AnsiballZ_copy.py'
Feb 26 20:38:00 compute-0 sudo[210945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.295 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.296 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.314 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.326 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.328 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:38:00 compute-0 nova_compute[186588]: 2026-02-26 20:38:00.328 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:38:00 compute-0 python3.9[210948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1772138279.3963299-1305-254165715230760/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:00 compute-0 sudo[210945]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:00 compute-0 sudo[211098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwrdghedxmcxaqpcbjaljyypiulijnff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138280.6511302-1321-271885653503572/AnsiballZ_file.py'
Feb 26 20:38:00 compute-0 sudo[211098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:01 compute-0 python3.9[211101]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:01 compute-0 sudo[211098]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:01 compute-0 nova_compute[186588]: 2026-02-26 20:38:01.328 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:38:01 compute-0 nova_compute[186588]: 2026-02-26 20:38:01.328 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:38:01 compute-0 anacron[8667]: Job `cron.daily' started
Feb 26 20:38:01 compute-0 anacron[8667]: Job `cron.daily' terminated
Feb 26 20:38:01 compute-0 sudo[211253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxcvoqyxqadvgklwzjqqvdcliuhuvdqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138281.2738605-1329-184517780438846/AnsiballZ_stat.py'
Feb 26 20:38:01 compute-0 sudo[211253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:01 compute-0 python3.9[211256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:01 compute-0 sudo[211253]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:02 compute-0 sudo[211332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bybnadbrswqwssngudfqvjoeandyrgee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138281.2738605-1329-184517780438846/AnsiballZ_file.py'
Feb 26 20:38:02 compute-0 sudo[211332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:02 compute-0 python3.9[211335]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:02 compute-0 sudo[211332]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:02 compute-0 sudo[211485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufnrtabewuhhubbkdvjgvjfmgxtlbjmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138282.4374554-1341-275810492803413/AnsiballZ_stat.py'
Feb 26 20:38:02 compute-0 sudo[211485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:02 compute-0 python3.9[211488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:02 compute-0 sudo[211485]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:03 compute-0 sudo[211564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-robhkyeorgvcgzjmfsyesbhxvyymbuks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138282.4374554-1341-275810492803413/AnsiballZ_file.py'
Feb 26 20:38:03 compute-0 sudo[211564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:03 compute-0 python3.9[211567]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6_ep66fk recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:03 compute-0 sudo[211564]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:03 compute-0 sudo[211717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqoflenejrystrmqrqiogmfydezkdcsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138283.6263201-1353-182556994828867/AnsiballZ_stat.py'
Feb 26 20:38:03 compute-0 sudo[211717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:04 compute-0 python3.9[211720]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:04 compute-0 sudo[211717]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:04 compute-0 sudo[211796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asmkjmdhdnwjaatvduxssunyxpkatcei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138283.6263201-1353-182556994828867/AnsiballZ_file.py'
Feb 26 20:38:04 compute-0 sudo[211796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:05 compute-0 python3.9[211799]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:05 compute-0 sudo[211796]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:05 compute-0 podman[211801]: 2026-02-26 20:38:05.111816481 +0000 UTC m=+0.057248383 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:38:05 compute-0 sudo[211973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlugexydkcsnhlanecfnhxbacbtcynpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138285.2117558-1366-32224751627523/AnsiballZ_command.py'
Feb 26 20:38:05 compute-0 sudo[211973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:05 compute-0 python3.9[211976]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:38:05 compute-0 sudo[211973]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:06 compute-0 sudo[212127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqzqngurmzrlakgberbexuvqebynohae ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772138285.7703094-1374-144430913980179/AnsiballZ_edpm_nftables_from_files.py'
Feb 26 20:38:06 compute-0 sudo[212127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:06 compute-0 python3[212130]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 26 20:38:06 compute-0 sudo[212127]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:06 compute-0 sudo[212280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gylhemhkkngcwviobppxhgfhaayexdlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138286.5142086-1382-72688241412899/AnsiballZ_stat.py'
Feb 26 20:38:06 compute-0 sudo[212280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:06 compute-0 python3.9[212283]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:07 compute-0 sudo[212280]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:07 compute-0 sudo[212359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trduxzsvkjowqqzwwfxruxyxoofyjlbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138286.5142086-1382-72688241412899/AnsiballZ_file.py'
Feb 26 20:38:07 compute-0 sudo[212359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:07 compute-0 python3.9[212362]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:07 compute-0 sudo[212359]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:07 compute-0 sudo[212512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnzcnqjulggbhmvrluwajcbvysbykshi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138287.6564772-1394-257658677694694/AnsiballZ_stat.py'
Feb 26 20:38:07 compute-0 sudo[212512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:08 compute-0 python3.9[212515]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:08 compute-0 sudo[212512]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:08 compute-0 sudo[212591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcgqtjmghddutsdazbgvldwizjsccrmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138287.6564772-1394-257658677694694/AnsiballZ_file.py'
Feb 26 20:38:08 compute-0 sudo[212591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:08 compute-0 python3.9[212594]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:08 compute-0 sudo[212591]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:09 compute-0 sudo[212757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhblxbwzvhcjbncowxanxbynasjvihqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138288.8129644-1406-244495160674575/AnsiballZ_stat.py'
Feb 26 20:38:09 compute-0 sudo[212757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:09 compute-0 podman[212718]: 2026-02-26 20:38:09.090783441 +0000 UTC m=+0.052171803 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 26 20:38:09 compute-0 python3.9[212766]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:09 compute-0 sudo[212757]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:09 compute-0 sudo[212844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyfqqxvfilofiuvuazrcbldrxfnrjjhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138288.8129644-1406-244495160674575/AnsiballZ_file.py'
Feb 26 20:38:09 compute-0 sudo[212844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:09 compute-0 python3.9[212847]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:09 compute-0 sudo[212844]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:10 compute-0 sudo[212997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uydgrojfmdpjwqvcaupjowxzskqmyeah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138289.894342-1418-60618412403378/AnsiballZ_stat.py'
Feb 26 20:38:10 compute-0 sudo[212997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:10 compute-0 python3.9[213000]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:10 compute-0 sudo[212997]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:10 compute-0 sudo[213076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsbhphdrnxjphqowlmyyogorjsmohccc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138289.894342-1418-60618412403378/AnsiballZ_file.py'
Feb 26 20:38:10 compute-0 sudo[213076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:10 compute-0 python3.9[213079]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:10 compute-0 sudo[213076]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:11 compute-0 sudo[213229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aucgyycxueewnguzrcboepxvelsvgoxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138291.085851-1430-80910139638131/AnsiballZ_stat.py'
Feb 26 20:38:11 compute-0 sudo[213229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:11 compute-0 python3.9[213232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 26 20:38:11 compute-0 sudo[213229]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:12 compute-0 sudo[213355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crcfashrvlgkpahcydbavtrikzomqeuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138291.085851-1430-80910139638131/AnsiballZ_copy.py'
Feb 26 20:38:12 compute-0 sudo[213355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:12 compute-0 python3.9[213358]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772138291.085851-1430-80910139638131/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:12 compute-0 sudo[213355]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:12 compute-0 sudo[213508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-renxoqfuvvjtkhlchutwcheeyshmfoof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138292.5862412-1445-176512016010518/AnsiballZ_file.py'
Feb 26 20:38:12 compute-0 sudo[213508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:13 compute-0 python3.9[213511]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:13 compute-0 sudo[213508]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:13 compute-0 sudo[213661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muewobzsbtogrhnmhqcpzrpeakceudkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138293.160028-1453-58638976592075/AnsiballZ_command.py'
Feb 26 20:38:13 compute-0 sudo[213661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:13 compute-0 python3.9[213664]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:38:13 compute-0 sudo[213661]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:14 compute-0 sudo[213817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckrdqmjteonjvfthmkhjqoxlplboevix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138293.728624-1461-9806305760474/AnsiballZ_blockinfile.py'
Feb 26 20:38:14 compute-0 sudo[213817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:14 compute-0 python3.9[213820]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:14 compute-0 sudo[213817]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:14 compute-0 sudo[213970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xemgwstbdahmufznektaigtwjgqrdzka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138294.6937854-1470-101507302503757/AnsiballZ_command.py'
Feb 26 20:38:15 compute-0 sudo[213970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:15 compute-0 python3.9[213973]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:38:15 compute-0 sudo[213970]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:15 compute-0 sudo[214124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwhtwyklyixdxphgvntfihtjbyzdbecx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138295.411712-1478-36547477336864/AnsiballZ_stat.py'
Feb 26 20:38:15 compute-0 sudo[214124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:15 compute-0 python3.9[214127]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 26 20:38:15 compute-0 sudo[214124]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:16 compute-0 sudo[214279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsenibfhrqhxsksnzrskgmnfdyiyxmxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138296.2003593-1486-100627009633093/AnsiballZ_command.py'
Feb 26 20:38:16 compute-0 sudo[214279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:16 compute-0 python3.9[214282]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 26 20:38:16 compute-0 sudo[214279]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:17 compute-0 sudo[214435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyxblkxjwswgrxgbzjbhxsihssgdoaca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772138296.8814373-1494-35497677721188/AnsiballZ_file.py'
Feb 26 20:38:17 compute-0 sudo[214435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:38:17 compute-0 python3.9[214438]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 26 20:38:17 compute-0 sudo[214435]: pam_unix(sudo:session): session closed for user root
Feb 26 20:38:17 compute-0 sshd-session[186888]: Connection closed by 192.168.122.30 port 58936
Feb 26 20:38:17 compute-0 sshd-session[186885]: pam_unix(sshd:session): session closed for user zuul
Feb 26 20:38:17 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 26 20:38:17 compute-0 systemd[1]: session-25.scope: Consumed 1min 38.615s CPU time.
Feb 26 20:38:17 compute-0 systemd-logind[825]: Session 25 logged out. Waiting for processes to exit.
Feb 26 20:38:17 compute-0 systemd-logind[825]: Removed session 25.
Feb 26 20:38:20 compute-0 podman[214464]: 2026-02-26 20:38:20.534380874 +0000 UTC m=+0.049786389 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 26 20:38:20 compute-0 podman[214465]: 2026-02-26 20:38:20.550721556 +0000 UTC m=+0.061401030 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0)
Feb 26 20:38:21 compute-0 podman[214506]: 2026-02-26 20:38:21.526772707 +0000 UTC m=+0.044211003 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 26 20:38:28 compute-0 podman[214531]: 2026-02-26 20:38:28.574773134 +0000 UTC m=+0.082678858 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:38:29 compute-0 podman[202527]: time="2026-02-26T20:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:38:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:38:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2987 "" "Go-http-client/1.1"
Feb 26 20:38:31 compute-0 openstack_network_exporter[205682]: ERROR   20:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:38:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:38:31 compute-0 openstack_network_exporter[205682]: ERROR   20:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:38:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:38:35 compute-0 podman[214564]: 2026-02-26 20:38:35.552682553 +0000 UTC m=+0.068555497 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:38:39 compute-0 podman[214588]: 2026-02-26 20:38:39.555610515 +0000 UTC m=+0.073651727 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1770267347, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 26 20:38:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:38:46.500 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:38:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:38:46.500 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:38:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:38:46.501 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.069 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.069 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.069 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.070 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.070 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348191f70>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.078 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.079 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:38:51.080 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:38:51 compute-0 podman[214611]: 2026-02-26 20:38:51.562883074 +0000 UTC m=+0.072593784 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 26 20:38:51 compute-0 podman[214610]: 2026-02-26 20:38:51.568166845 +0000 UTC m=+0.080653248 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 26 20:38:51 compute-0 podman[214647]: 2026-02-26 20:38:51.626703854 +0000 UTC m=+0.051633827 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:38:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:38:56.818 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:c2:31', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:84:98:ae:7a:1c'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:38:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:38:56.820 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 20:38:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:38:56.822 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:38:59 compute-0 podman[214672]: 2026-02-26 20:38:59.607806503 +0000 UTC m=+0.104053212 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260223, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:38:59 compute-0 podman[202527]: time="2026-02-26T20:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:38:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:38:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2995 "" "Go-http-client/1.1"
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.078 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.079 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.080 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.080 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.080 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.081 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.114 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.115 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.115 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.116 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.346 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.347 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5919MB free_disk=72.77655029296875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.347 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.347 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.432 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.432 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.470 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.491 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.494 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:39:00 compute-0 nova_compute[186588]: 2026-02-26 20:39:00.495 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:39:01 compute-0 openstack_network_exporter[205682]: ERROR   20:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:39:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:39:01 compute-0 openstack_network_exporter[205682]: ERROR   20:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:39:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:39:01 compute-0 nova_compute[186588]: 2026-02-26 20:39:01.476 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:01 compute-0 nova_compute[186588]: 2026-02-26 20:39:01.477 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:02 compute-0 nova_compute[186588]: 2026-02-26 20:39:02.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:03 compute-0 nova_compute[186588]: 2026-02-26 20:39:03.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:06 compute-0 podman[214702]: 2026-02-26 20:39:06.561206429 +0000 UTC m=+0.069836361 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:39:10 compute-0 podman[214726]: 2026-02-26 20:39:10.559121953 +0000 UTC m=+0.066497941 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, release=1770267347, vcs-type=git, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Feb 26 20:39:22 compute-0 podman[214747]: 2026-02-26 20:39:22.549276903 +0000 UTC m=+0.052484419 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:39:22 compute-0 podman[214748]: 2026-02-26 20:39:22.559597098 +0000 UTC m=+0.060758979 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 26 20:39:22 compute-0 podman[214749]: 2026-02-26 20:39:22.560728127 +0000 UTC m=+0.057467910 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 26 20:39:29 compute-0 podman[202527]: time="2026-02-26T20:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:39:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:39:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2993 "" "Go-http-client/1.1"
Feb 26 20:39:30 compute-0 podman[214801]: 2026-02-26 20:39:30.582767816 +0000 UTC m=+0.091481117 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, config_id=ovn_controller)
Feb 26 20:39:31 compute-0 openstack_network_exporter[205682]: ERROR   20:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:39:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:39:31 compute-0 openstack_network_exporter[205682]: ERROR   20:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:39:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:39:37 compute-0 podman[214826]: 2026-02-26 20:39:37.536012837 +0000 UTC m=+0.054274917 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:39:41 compute-0 podman[214851]: 2026-02-26 20:39:41.550126143 +0000 UTC m=+0.060342768 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, version=9.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Feb 26 20:39:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:39:46.502 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:39:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:39:46.502 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:39:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:39:46.503 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:39:53 compute-0 podman[214874]: 2026-02-26 20:39:53.548629938 +0000 UTC m=+0.058417237 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 26 20:39:53 compute-0 podman[214873]: 2026-02-26 20:39:53.550064289 +0000 UTC m=+0.060467920 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:39:53 compute-0 podman[214875]: 2026-02-26 20:39:53.555997102 +0000 UTC m=+0.060191930 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223)
Feb 26 20:39:59 compute-0 nova_compute[186588]: 2026-02-26 20:39:59.056 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:39:59 compute-0 podman[202527]: time="2026-02-26T20:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:39:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:39:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3000 "" "Go-http-client/1.1"
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.073 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.074 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.074 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.116 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.117 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.118 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.118 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.268 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.269 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5951MB free_disk=72.77656936645508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.269 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.269 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.326 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.327 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.356 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.387 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.390 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:40:00 compute-0 nova_compute[186588]: 2026-02-26 20:40:00.390 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:40:01 compute-0 nova_compute[186588]: 2026-02-26 20:40:01.376 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:01 compute-0 nova_compute[186588]: 2026-02-26 20:40:01.376 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:01 compute-0 nova_compute[186588]: 2026-02-26 20:40:01.376 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:01 compute-0 nova_compute[186588]: 2026-02-26 20:40:01.377 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:01 compute-0 nova_compute[186588]: 2026-02-26 20:40:01.377 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:40:01 compute-0 openstack_network_exporter[205682]: ERROR   20:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:40:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:40:01 compute-0 openstack_network_exporter[205682]: ERROR   20:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:40:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:40:01 compute-0 podman[214937]: 2026-02-26 20:40:01.576842392 +0000 UTC m=+0.088390463 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 26 20:40:04 compute-0 nova_compute[186588]: 2026-02-26 20:40:04.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:04 compute-0 nova_compute[186588]: 2026-02-26 20:40:04.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:08 compute-0 podman[214963]: 2026-02-26 20:40:08.573825401 +0000 UTC m=+0.086680981 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:40:12 compute-0 podman[214985]: 2026-02-26 20:40:12.524543827 +0000 UTC m=+0.042777526 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, 
com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 26 20:40:24 compute-0 podman[215009]: 2026-02-26 20:40:24.552634716 +0000 UTC m=+0.051079994 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:40:24 compute-0 podman[215008]: 2026-02-26 20:40:24.563577669 +0000 UTC m=+0.067758342 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260223, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:40:24 compute-0 podman[215007]: 2026-02-26 20:40:24.570967504 +0000 UTC m=+0.076974033 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:40:29 compute-0 podman[202527]: time="2026-02-26T20:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:40:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:40:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3000 "" "Go-http-client/1.1"
Feb 26 20:40:31 compute-0 openstack_network_exporter[205682]: ERROR   20:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:40:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:40:31 compute-0 openstack_network_exporter[205682]: ERROR   20:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:40:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:40:32 compute-0 podman[215068]: 2026-02-26 20:40:32.542509378 +0000 UTC m=+0.058316473 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 26 20:40:39 compute-0 podman[215094]: 2026-02-26 20:40:39.538592719 +0000 UTC m=+0.051721873 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:40:43 compute-0 podman[215118]: 2026-02-26 20:40:43.530644433 +0000 UTC m=+0.048380263 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, version=9.7)
Feb 26 20:40:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:40:46.502 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:40:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:40:46.502 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:40:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:40:46.503 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.069 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.069 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.069 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.070 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.070 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.085 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.085 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.085 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.085 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:40:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:40:55 compute-0 podman[215142]: 2026-02-26 20:40:55.529612055 +0000 UTC m=+0.048015649 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:40:55 compute-0 podman[215144]: 2026-02-26 20:40:55.538674588 +0000 UTC m=+0.052882220 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223)
Feb 26 20:40:55 compute-0 podman[215143]: 2026-02-26 20:40:55.557832202 +0000 UTC m=+0.073618706 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223)
Feb 26 20:40:59 compute-0 nova_compute[186588]: 2026-02-26 20:40:59.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:59 compute-0 nova_compute[186588]: 2026-02-26 20:40:59.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 26 20:40:59 compute-0 nova_compute[186588]: 2026-02-26 20:40:59.077 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 26 20:40:59 compute-0 nova_compute[186588]: 2026-02-26 20:40:59.079 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:59 compute-0 nova_compute[186588]: 2026-02-26 20:40:59.079 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 26 20:40:59 compute-0 nova_compute[186588]: 2026-02-26 20:40:59.091 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:40:59 compute-0 podman[202527]: time="2026-02-26T20:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:40:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:40:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Feb 26 20:41:00 compute-0 nova_compute[186588]: 2026-02-26 20:41:00.101 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.076 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.076 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.103 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.103 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.104 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.104 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.252 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.253 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5958MB free_disk=72.77826309204102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.253 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.253 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:41:01 compute-0 openstack_network_exporter[205682]: ERROR   20:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:41:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:41:01 compute-0 openstack_network_exporter[205682]: ERROR   20:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:41:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.476 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.477 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.569 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Refreshing inventories for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.653 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Updating ProviderTree inventory for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.654 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.667 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Refreshing aggregate associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.692 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Refreshing trait associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.710 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.727 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.728 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:41:01 compute-0 nova_compute[186588]: 2026-02-26 20:41:01.728 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:41:02 compute-0 nova_compute[186588]: 2026-02-26 20:41:02.712 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:02 compute-0 nova_compute[186588]: 2026-02-26 20:41:02.712 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:02 compute-0 nova_compute[186588]: 2026-02-26 20:41:02.713 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:02 compute-0 nova_compute[186588]: 2026-02-26 20:41:02.713 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:41:03 compute-0 nova_compute[186588]: 2026-02-26 20:41:03.056 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:03 compute-0 podman[215199]: 2026-02-26 20:41:03.612559689 +0000 UTC m=+0.126138014 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 26 20:41:05 compute-0 nova_compute[186588]: 2026-02-26 20:41:05.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:05 compute-0 nova_compute[186588]: 2026-02-26 20:41:05.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:10 compute-0 podman[215226]: 2026-02-26 20:41:10.535785157 +0000 UTC m=+0.046234081 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:41:14 compute-0 podman[215250]: 2026-02-26 20:41:14.54876551 +0000 UTC m=+0.065201389 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public)
Feb 26 20:41:26 compute-0 podman[215274]: 2026-02-26 20:41:26.558888718 +0000 UTC m=+0.070377329 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:41:26 compute-0 podman[215275]: 2026-02-26 20:41:26.569113312 +0000 UTC m=+0.075340442 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent)
Feb 26 20:41:26 compute-0 podman[215276]: 2026-02-26 20:41:26.583140958 +0000 UTC m=+0.087515558 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0)
Feb 26 20:41:29 compute-0 podman[202527]: time="2026-02-26T20:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:41:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:41:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Feb 26 20:41:31 compute-0 openstack_network_exporter[205682]: ERROR   20:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:41:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:41:31 compute-0 openstack_network_exporter[205682]: ERROR   20:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:41:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:41:34 compute-0 podman[215336]: 2026-02-26 20:41:34.836890964 +0000 UTC m=+0.064019918 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 26 20:41:41 compute-0 podman[215362]: 2026-02-26 20:41:41.559163114 +0000 UTC m=+0.063821833 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:41:44 compute-0 podman[215387]: 2026-02-26 20:41:44.734566724 +0000 UTC m=+0.054843511 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347)
Feb 26 20:41:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:41:46.504 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:41:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:41:46.505 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:41:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:41:46.505 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:41:57 compute-0 podman[215410]: 2026-02-26 20:41:57.551268572 +0000 UTC m=+0.063194005 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:41:57 compute-0 podman[215412]: 2026-02-26 20:41:57.563060989 +0000 UTC m=+0.064591973 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:41:57 compute-0 podman[215411]: 2026-02-26 20:41:57.575072201 +0000 UTC m=+0.085768242 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 26 20:41:59 compute-0 nova_compute[186588]: 2026-02-26 20:41:59.056 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:41:59 compute-0 podman[202527]: time="2026-02-26T20:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:41:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:41:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Feb 26 20:42:01 compute-0 nova_compute[186588]: 2026-02-26 20:42:01.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:01 compute-0 nova_compute[186588]: 2026-02-26 20:42:01.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:42:01 compute-0 nova_compute[186588]: 2026-02-26 20:42:01.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:42:01 compute-0 nova_compute[186588]: 2026-02-26 20:42:01.074 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:42:01 compute-0 nova_compute[186588]: 2026-02-26 20:42:01.074 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:01 compute-0 openstack_network_exporter[205682]: ERROR   20:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:42:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:42:01 compute-0 openstack_network_exporter[205682]: ERROR   20:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:42:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:42:02 compute-0 nova_compute[186588]: 2026-02-26 20:42:02.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.087 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.087 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.087 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.088 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.232 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.233 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5971MB free_disk=72.77827072143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.233 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.233 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.297 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.297 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.346 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.361 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.363 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:42:03 compute-0 nova_compute[186588]: 2026-02-26 20:42:03.363 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:42:04 compute-0 nova_compute[186588]: 2026-02-26 20:42:04.357 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:04 compute-0 nova_compute[186588]: 2026-02-26 20:42:04.357 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:05 compute-0 nova_compute[186588]: 2026-02-26 20:42:05.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:05 compute-0 podman[215475]: 2026-02-26 20:42:05.565129466 +0000 UTC m=+0.082488139 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:42:06 compute-0 nova_compute[186588]: 2026-02-26 20:42:06.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:42:12 compute-0 podman[215501]: 2026-02-26 20:42:12.557339019 +0000 UTC m=+0.064934416 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:42:15 compute-0 podman[215526]: 2026-02-26 20:42:15.526739313 +0000 UTC m=+0.045842730 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=)
Feb 26 20:42:28 compute-0 podman[215550]: 2026-02-26 20:42:28.551943559 +0000 UTC m=+0.060157718 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 26 20:42:28 compute-0 podman[215549]: 2026-02-26 20:42:28.559627675 +0000 UTC m=+0.069275592 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 26 20:42:28 compute-0 podman[215548]: 2026-02-26 20:42:28.579648456 +0000 UTC m=+0.090009253 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:42:29 compute-0 podman[202527]: time="2026-02-26T20:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:42:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:42:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Feb 26 20:42:29 compute-0 rsyslogd[1016]: imjournal: 1747 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 26 20:42:31 compute-0 openstack_network_exporter[205682]: ERROR   20:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:42:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:42:31 compute-0 openstack_network_exporter[205682]: ERROR   20:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:42:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:42:36 compute-0 podman[215611]: 2026-02-26 20:42:36.576241306 +0000 UTC m=+0.083929659 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260223, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:42:43 compute-0 podman[215638]: 2026-02-26 20:42:43.55077441 +0000 UTC m=+0.063318782 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:42:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:42:46.505 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:42:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:42:46.505 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:42:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:42:46.505 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:42:46 compute-0 podman[215663]: 2026-02-26 20:42:46.544117851 +0000 UTC m=+0.055002677 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., 
org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.070 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.070 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.071 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.072 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.073 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.073 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.073 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:42:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:42:59 compute-0 podman[215686]: 2026-02-26 20:42:59.540658821 +0000 UTC m=+0.055082398 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:42:59 compute-0 podman[215687]: 2026-02-26 20:42:59.541780252 +0000 UTC m=+0.053718302 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:42:59 compute-0 podman[215688]: 2026-02-26 20:42:59.571764983 +0000 UTC m=+0.081439762 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Feb 26 20:42:59 compute-0 podman[202527]: time="2026-02-26T20:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:42:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:42:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Feb 26 20:43:01 compute-0 nova_compute[186588]: 2026-02-26 20:43:01.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:01 compute-0 openstack_network_exporter[205682]: ERROR   20:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:43:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:43:01 compute-0 openstack_network_exporter[205682]: ERROR   20:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:43:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.111 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.112 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.131 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.132 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.132 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.132 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.247 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.248 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5966MB free_disk=72.77732467651367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.248 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.248 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.322 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.323 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.342 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.351 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.353 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:43:03 compute-0 nova_compute[186588]: 2026-02-26 20:43:03.353 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:43:04 compute-0 nova_compute[186588]: 2026-02-26 20:43:04.304 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:04 compute-0 nova_compute[186588]: 2026-02-26 20:43:04.305 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:05 compute-0 nova_compute[186588]: 2026-02-26 20:43:05.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:05 compute-0 nova_compute[186588]: 2026-02-26 20:43:05.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:05 compute-0 nova_compute[186588]: 2026-02-26 20:43:05.062 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:43:06 compute-0 nova_compute[186588]: 2026-02-26 20:43:06.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:06 compute-0 nova_compute[186588]: 2026-02-26 20:43:06.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:07 compute-0 podman[215747]: 2026-02-26 20:43:07.573615094 +0000 UTC m=+0.090078809 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 26 20:43:14 compute-0 podman[215773]: 2026-02-26 20:43:14.530361988 +0000 UTC m=+0.037055679 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:43:17 compute-0 podman[215797]: 2026-02-26 20:43:17.521747277 +0000 UTC m=+0.040792920 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1770267347, architecture=x86_64, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7)
Feb 26 20:43:29 compute-0 podman[202527]: time="2026-02-26T20:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:43:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:43:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Feb 26 20:43:30 compute-0 podman[215821]: 2026-02-26 20:43:30.550620598 +0000 UTC m=+0.060515452 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223)
Feb 26 20:43:30 compute-0 podman[215819]: 2026-02-26 20:43:30.556224259 +0000 UTC m=+0.063623696 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:43:30 compute-0 podman[215820]: 2026-02-26 20:43:30.572667653 +0000 UTC m=+0.082400003 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent)
Feb 26 20:43:31 compute-0 openstack_network_exporter[205682]: ERROR   20:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:43:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:43:31 compute-0 openstack_network_exporter[205682]: ERROR   20:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:43:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:43:38 compute-0 podman[215879]: 2026-02-26 20:43:38.54970064 +0000 UTC m=+0.068483707 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 26 20:43:44 compute-0 podman[215906]: 2026-02-26 20:43:44.736671004 +0000 UTC m=+0.059536266 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:43:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:43:46.506 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:43:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:43:46.507 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:43:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:43:46.507 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:43:48 compute-0 podman[215931]: 2026-02-26 20:43:48.560108704 +0000 UTC m=+0.070398899 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 26 20:43:59 compute-0 nova_compute[186588]: 2026-02-26 20:43:59.055 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:43:59 compute-0 podman[202527]: time="2026-02-26T20:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:43:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:43:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3001 "" "Go-http-client/1.1"
Feb 26 20:44:01 compute-0 openstack_network_exporter[205682]: ERROR   20:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:44:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:44:01 compute-0 openstack_network_exporter[205682]: ERROR   20:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:44:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:44:01 compute-0 podman[215955]: 2026-02-26 20:44:01.53576426 +0000 UTC m=+0.049492875 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:44:01 compute-0 podman[215956]: 2026-02-26 20:44:01.535806641 +0000 UTC m=+0.051838129 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Feb 26 20:44:01 compute-0 podman[215954]: 2026-02-26 20:44:01.563660062 +0000 UTC m=+0.078926359 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:44:03 compute-0 nova_compute[186588]: 2026-02-26 20:44:03.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:04 compute-0 nova_compute[186588]: 2026-02-26 20:44:04.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:04 compute-0 nova_compute[186588]: 2026-02-26 20:44:04.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:44:04 compute-0 nova_compute[186588]: 2026-02-26 20:44:04.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:44:04 compute-0 nova_compute[186588]: 2026-02-26 20:44:04.086 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.101 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.102 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.103 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.103 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.242 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.244 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5983MB free_disk=72.77734375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.244 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.244 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.301 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.302 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.324 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.340 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.342 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:44:05 compute-0 nova_compute[186588]: 2026-02-26 20:44:05.342 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:44:06 compute-0 nova_compute[186588]: 2026-02-26 20:44:06.337 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:06 compute-0 nova_compute[186588]: 2026-02-26 20:44:06.338 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:06 compute-0 nova_compute[186588]: 2026-02-26 20:44:06.338 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:06 compute-0 nova_compute[186588]: 2026-02-26 20:44:06.338 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:44:07 compute-0 nova_compute[186588]: 2026-02-26 20:44:07.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:07 compute-0 nova_compute[186588]: 2026-02-26 20:44:07.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:44:09 compute-0 podman[216017]: 2026-02-26 20:44:09.647114239 +0000 UTC m=+0.148521386 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:44:15 compute-0 podman[216044]: 2026-02-26 20:44:15.573846226 +0000 UTC m=+0.075697247 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:44:19 compute-0 podman[216068]: 2026-02-26 20:44:19.559770965 +0000 UTC m=+0.069020640 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Feb 26 20:44:29 compute-0 podman[202527]: time="2026-02-26T20:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:44:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:44:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Feb 26 20:44:31 compute-0 openstack_network_exporter[205682]: ERROR   20:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:44:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:44:31 compute-0 openstack_network_exporter[205682]: ERROR   20:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:44:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:44:32 compute-0 podman[216092]: 2026-02-26 20:44:32.52946375 +0000 UTC m=+0.043708998 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:44:32 compute-0 podman[216093]: 2026-02-26 20:44:32.53886142 +0000 UTC m=+0.048177407 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 26 20:44:32 compute-0 podman[216094]: 2026-02-26 20:44:32.544604642 +0000 UTC m=+0.052211064 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 26 20:44:40 compute-0 podman[216156]: 2026-02-26 20:44:40.600736443 +0000 UTC m=+0.114336540 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20260223, org.label-schema.license=GPLv2)
Feb 26 20:44:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:44:46.508 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:44:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:44:46.508 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:44:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:44:46.509 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:44:46 compute-0 podman[216184]: 2026-02-26 20:44:46.529599446 +0000 UTC m=+0.044935722 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:44:50 compute-0 podman[216212]: 2026-02-26 20:44:50.546672839 +0000 UTC m=+0.058595014 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.7, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, name=ubi9/ubi-minimal)
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.070 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.071 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.072 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.073 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.073 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.084 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.084 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.086 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.087 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:44:51.088 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:44:59 compute-0 podman[202527]: time="2026-02-26T20:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:44:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:44:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Feb 26 20:45:01 compute-0 openstack_network_exporter[205682]: ERROR   20:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:45:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:45:01 compute-0 openstack_network_exporter[205682]: ERROR   20:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:45:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:45:03 compute-0 nova_compute[186588]: 2026-02-26 20:45:03.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:03 compute-0 podman[216235]: 2026-02-26 20:45:03.541297666 +0000 UTC m=+0.048218508 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:45:03 compute-0 podman[216236]: 2026-02-26 20:45:03.54558667 +0000 UTC m=+0.052535133 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 26 20:45:03 compute-0 podman[216237]: 2026-02-26 20:45:03.546951106 +0000 UTC m=+0.052889951 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:45:04 compute-0 nova_compute[186588]: 2026-02-26 20:45:04.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:04 compute-0 nova_compute[186588]: 2026-02-26 20:45:04.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:45:04 compute-0 nova_compute[186588]: 2026-02-26 20:45:04.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:45:04 compute-0 nova_compute[186588]: 2026-02-26 20:45:04.076 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.079 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.079 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.079 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.079 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.195 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.196 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6005MB free_disk=72.77732467651367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.196 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.196 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.271 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.271 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.293 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.306 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.307 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:45:05 compute-0 nova_compute[186588]: 2026-02-26 20:45:05.307 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:45:06 compute-0 nova_compute[186588]: 2026-02-26 20:45:06.303 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:06 compute-0 nova_compute[186588]: 2026-02-26 20:45:06.303 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:07 compute-0 nova_compute[186588]: 2026-02-26 20:45:07.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:07 compute-0 nova_compute[186588]: 2026-02-26 20:45:07.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:07 compute-0 nova_compute[186588]: 2026-02-26 20:45:07.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:45:08 compute-0 nova_compute[186588]: 2026-02-26 20:45:08.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:45:11 compute-0 podman[216297]: 2026-02-26 20:45:11.562344128 +0000 UTC m=+0.071209938 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 26 20:45:17 compute-0 podman[216323]: 2026-02-26 20:45:17.527216879 +0000 UTC m=+0.044687013 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:45:21 compute-0 podman[216348]: 2026-02-26 20:45:21.561598762 +0000 UTC m=+0.063064654 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9)
Feb 26 20:45:29 compute-0 podman[202527]: time="2026-02-26T20:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:45:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:45:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Feb 26 20:45:31 compute-0 openstack_network_exporter[205682]: ERROR   20:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:45:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:45:31 compute-0 openstack_network_exporter[205682]: ERROR   20:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:45:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:45:34 compute-0 podman[216369]: 2026-02-26 20:45:34.525954945 +0000 UTC m=+0.044123579 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:45:34 compute-0 podman[216371]: 2026-02-26 20:45:34.536791065 +0000 UTC m=+0.046645937 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 26 20:45:34 compute-0 podman[216370]: 2026-02-26 20:45:34.542295202 +0000 UTC m=+0.054330743 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible)
Feb 26 20:45:42 compute-0 podman[216431]: 2026-02-26 20:45:42.566782464 +0000 UTC m=+0.082293196 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260223, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:45:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:45:46.511 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:45:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:45:46.512 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:45:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:45:46.512 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:45:48 compute-0 podman[216457]: 2026-02-26 20:45:48.555668906 +0000 UTC m=+0.058209165 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:45:52 compute-0 podman[216482]: 2026-02-26 20:45:52.533874689 +0000 UTC m=+0.050626332 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Feb 26 20:45:59 compute-0 podman[202527]: time="2026-02-26T20:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:45:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:45:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3000 "" "Go-http-client/1.1"
Feb 26 20:46:00 compute-0 nova_compute[186588]: 2026-02-26 20:46:00.054 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:00 compute-0 nova_compute[186588]: 2026-02-26 20:46:00.074 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:00 compute-0 nova_compute[186588]: 2026-02-26 20:46:00.075 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 26 20:46:00 compute-0 nova_compute[186588]: 2026-02-26 20:46:00.090 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 26 20:46:01 compute-0 openstack_network_exporter[205682]: ERROR   20:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:46:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:46:01 compute-0 openstack_network_exporter[205682]: ERROR   20:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:46:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:46:02 compute-0 nova_compute[186588]: 2026-02-26 20:46:02.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:03 compute-0 nova_compute[186588]: 2026-02-26 20:46:03.081 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:05 compute-0 nova_compute[186588]: 2026-02-26 20:46:05.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:05 compute-0 nova_compute[186588]: 2026-02-26 20:46:05.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:46:05 compute-0 nova_compute[186588]: 2026-02-26 20:46:05.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:46:05 compute-0 nova_compute[186588]: 2026-02-26 20:46:05.074 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:46:05 compute-0 podman[216503]: 2026-02-26 20:46:05.544230012 +0000 UTC m=+0.051987908 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:46:05 compute-0 podman[216505]: 2026-02-26 20:46:05.553703505 +0000 UTC m=+0.055336758 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 26 20:46:05 compute-0 podman[216504]: 2026-02-26 20:46:05.57489529 +0000 UTC m=+0.078734912 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 26 20:46:05 compute-0 nova_compute[186588]: 2026-02-26 20:46:05.873 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.079 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.080 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.080 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.080 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.219 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.220 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6000MB free_disk=72.7776107788086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.220 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.220 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.472 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.472 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.598 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Refreshing inventories for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.668 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Updating ProviderTree inventory for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.668 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.682 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Refreshing aggregate associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.700 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Refreshing trait associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.720 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.732 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.733 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:46:06 compute-0 nova_compute[186588]: 2026-02-26 20:46:06.733 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:46:07 compute-0 nova_compute[186588]: 2026-02-26 20:46:07.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:07 compute-0 nova_compute[186588]: 2026-02-26 20:46:07.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:07 compute-0 nova_compute[186588]: 2026-02-26 20:46:07.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:07 compute-0 nova_compute[186588]: 2026-02-26 20:46:07.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 26 20:46:08 compute-0 nova_compute[186588]: 2026-02-26 20:46:08.071 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:08 compute-0 nova_compute[186588]: 2026-02-26 20:46:08.072 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:08 compute-0 nova_compute[186588]: 2026-02-26 20:46:08.073 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:46:09 compute-0 nova_compute[186588]: 2026-02-26 20:46:09.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:46:13 compute-0 podman[216567]: 2026-02-26 20:46:13.598232094 +0000 UTC m=+0.110562951 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 26 20:46:19 compute-0 podman[216593]: 2026-02-26 20:46:19.536581476 +0000 UTC m=+0.053344184 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:46:23 compute-0 podman[216618]: 2026-02-26 20:46:23.536819187 +0000 UTC m=+0.050574030 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git)
Feb 26 20:46:29 compute-0 podman[202527]: time="2026-02-26T20:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:46:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:46:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Feb 26 20:46:31 compute-0 openstack_network_exporter[205682]: ERROR   20:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:46:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:46:31 compute-0 openstack_network_exporter[205682]: ERROR   20:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:46:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:46:36 compute-0 podman[216641]: 2026-02-26 20:46:36.529391465 +0000 UTC m=+0.042807023 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:46:36 compute-0 podman[216640]: 2026-02-26 20:46:36.529606951 +0000 UTC m=+0.046205583 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:46:36 compute-0 podman[216642]: 2026-02-26 20:46:36.564702578 +0000 UTC m=+0.076285326 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:46:44 compute-0 podman[216700]: 2026-02-26 20:46:44.593721471 +0000 UTC m=+0.094208485 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20260223)
Feb 26 20:46:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:46:46.512 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:46:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:46:46.513 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:46:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:46:46.513 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:46:50 compute-0 podman[216728]: 2026-02-26 20:46:50.542703148 +0000 UTC m=+0.055676987 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.071 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.071 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.071 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.071 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f1348ac9d60>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:46:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:46:54 compute-0 podman[216753]: 2026-02-26 20:46:54.540970487 +0000 UTC m=+0.054424804 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 26 20:46:59 compute-0 podman[202527]: time="2026-02-26T20:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:46:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:46:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Feb 26 20:47:01 compute-0 openstack_network_exporter[205682]: ERROR   20:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:47:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:47:01 compute-0 openstack_network_exporter[205682]: ERROR   20:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:47:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:47:04 compute-0 nova_compute[186588]: 2026-02-26 20:47:04.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:06 compute-0 nova_compute[186588]: 2026-02-26 20:47:06.054 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:06 compute-0 nova_compute[186588]: 2026-02-26 20:47:06.058 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:06 compute-0 nova_compute[186588]: 2026-02-26 20:47:06.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:47:06 compute-0 nova_compute[186588]: 2026-02-26 20:47:06.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:47:06 compute-0 nova_compute[186588]: 2026-02-26 20:47:06.076 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:47:07 compute-0 nova_compute[186588]: 2026-02-26 20:47:07.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:07 compute-0 podman[216774]: 2026-02-26 20:47:07.525532162 +0000 UTC m=+0.045379862 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:47:07 compute-0 podman[216775]: 2026-02-26 20:47:07.530208956 +0000 UTC m=+0.045470964 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:47:07 compute-0 podman[216776]: 2026-02-26 20:47:07.557735581 +0000 UTC m=+0.065779426 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0)
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.085 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.086 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.086 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.086 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.222 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.223 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5994MB free_disk=72.77762985229492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.224 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.224 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.279 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.279 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.313 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.327 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.329 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:47:08 compute-0 nova_compute[186588]: 2026-02-26 20:47:08.329 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:47:09 compute-0 nova_compute[186588]: 2026-02-26 20:47:09.330 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:10 compute-0 nova_compute[186588]: 2026-02-26 20:47:10.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:47:10 compute-0 nova_compute[186588]: 2026-02-26 20:47:10.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:47:14 compute-0 podman[216835]: 2026-02-26 20:47:14.739665754 +0000 UTC m=+0.062636692 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:47:21 compute-0 podman[216863]: 2026-02-26 20:47:21.555823568 +0000 UTC m=+0.067576684 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:47:25 compute-0 podman[216889]: 2026-02-26 20:47:25.523085989 +0000 UTC m=+0.039822523 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, version=9.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 26 20:47:29 compute-0 podman[202527]: time="2026-02-26T20:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:47:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:47:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Feb 26 20:47:31 compute-0 openstack_network_exporter[205682]: ERROR   20:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:47:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:47:31 compute-0 openstack_network_exporter[205682]: ERROR   20:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:47:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:47:38 compute-0 podman[216912]: 2026-02-26 20:47:38.546610943 +0000 UTC m=+0.060214097 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 26 20:47:38 compute-0 podman[216913]: 2026-02-26 20:47:38.552569692 +0000 UTC m=+0.063130295 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260223)
Feb 26 20:47:38 compute-0 podman[216911]: 2026-02-26 20:47:38.569617086 +0000 UTC m=+0.087967037 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:47:45 compute-0 podman[216972]: 2026-02-26 20:47:45.582619533 +0000 UTC m=+0.102273041 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:47:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:47:46.514 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:47:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:47:46.514 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:47:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:47:46.514 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:47:52 compute-0 podman[216999]: 2026-02-26 20:47:52.582265122 +0000 UTC m=+0.091183884 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:47:56 compute-0 podman[217023]: 2026-02-26 20:47:56.619959233 +0000 UTC m=+0.107546551 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, 
build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal)
Feb 26 20:47:59 compute-0 podman[202527]: time="2026-02-26T20:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:47:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:47:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2999 "" "Go-http-client/1.1"
Feb 26 20:48:01 compute-0 openstack_network_exporter[205682]: ERROR   20:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:48:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:48:01 compute-0 openstack_network_exporter[205682]: ERROR   20:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:48:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:48:03 compute-0 nova_compute[186588]: 2026-02-26 20:48:03.055 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:06 compute-0 nova_compute[186588]: 2026-02-26 20:48:06.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:06 compute-0 nova_compute[186588]: 2026-02-26 20:48:06.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:48:06 compute-0 nova_compute[186588]: 2026-02-26 20:48:06.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:48:06 compute-0 nova_compute[186588]: 2026-02-26 20:48:06.074 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:48:06 compute-0 nova_compute[186588]: 2026-02-26 20:48:06.074 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:07 compute-0 nova_compute[186588]: 2026-02-26 20:48:07.070 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:08 compute-0 nova_compute[186588]: 2026-02-26 20:48:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.058 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.089 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.089 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.090 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.090 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.232 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.234 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5999MB free_disk=72.77764892578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.235 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.235 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.290 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.290 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.310 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.321 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.323 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:48:09 compute-0 nova_compute[186588]: 2026-02-26 20:48:09.323 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:48:09 compute-0 podman[217046]: 2026-02-26 20:48:09.560026882 +0000 UTC m=+0.067817660 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 26 20:48:09 compute-0 podman[217044]: 2026-02-26 20:48:09.570723557 +0000 UTC m=+0.077925860 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:48:09 compute-0 podman[217045]: 2026-02-26 20:48:09.573762588 +0000 UTC m=+0.086360085 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:48:10 compute-0 nova_compute[186588]: 2026-02-26 20:48:10.323 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:12 compute-0 nova_compute[186588]: 2026-02-26 20:48:12.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:48:12 compute-0 nova_compute[186588]: 2026-02-26 20:48:12.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:48:16 compute-0 podman[217104]: 2026-02-26 20:48:16.56523576 +0000 UTC m=+0.073911104 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:48:23 compute-0 podman[217131]: 2026-02-26 20:48:23.555880608 +0000 UTC m=+0.068878913 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:48:27 compute-0 podman[217155]: 2026-02-26 20:48:27.551372336 +0000 UTC m=+0.063916648 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal)
Feb 26 20:48:29 compute-0 podman[202527]: time="2026-02-26T20:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:48:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:48:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Feb 26 20:48:31 compute-0 openstack_network_exporter[205682]: ERROR   20:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:48:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:48:31 compute-0 openstack_network_exporter[205682]: ERROR   20:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:48:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:48:40 compute-0 podman[217177]: 2026-02-26 20:48:40.522454311 +0000 UTC m=+0.039152079 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:48:40 compute-0 podman[217179]: 2026-02-26 20:48:40.551737772 +0000 UTC m=+0.056292232 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Feb 26 20:48:40 compute-0 podman[217178]: 2026-02-26 20:48:40.568624548 +0000 UTC m=+0.082829299 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 26 20:48:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:48:46.515 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:48:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:48:46.516 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:48:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:48:46.517 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:48:47 compute-0 podman[217237]: 2026-02-26 20:48:47.58994717 +0000 UTC m=+0.107079224 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.071 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.071 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.072 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.074 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.074 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.079 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.080 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a020260>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': [], 'disk.root.size': [], 'network.outgoing.bytes.delta': [], 'disk.device.allocation': [], 'cpu': [], 'network.incoming.packets.drop': [], 'network.incoming.packets': [], 'memory.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.082 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.083 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:48:51.085 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:48:54 compute-0 podman[217265]: 2026-02-26 20:48:54.548131194 +0000 UTC m=+0.066317992 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:48:58 compute-0 podman[217289]: 2026-02-26 20:48:58.528444472 +0000 UTC m=+0.042274403 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Feb 26 20:48:59 compute-0 podman[202527]: time="2026-02-26T20:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:48:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:48:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Feb 26 20:49:01 compute-0 openstack_network_exporter[205682]: ERROR   20:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:49:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:49:01 compute-0 openstack_network_exporter[205682]: ERROR   20:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:49:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:49:07 compute-0 nova_compute[186588]: 2026-02-26 20:49:07.054 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:07 compute-0 nova_compute[186588]: 2026-02-26 20:49:07.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:07 compute-0 nova_compute[186588]: 2026-02-26 20:49:07.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:49:07 compute-0 nova_compute[186588]: 2026-02-26 20:49:07.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:49:07 compute-0 nova_compute[186588]: 2026-02-26 20:49:07.074 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:49:08 compute-0 nova_compute[186588]: 2026-02-26 20:49:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:09 compute-0 nova_compute[186588]: 2026-02-26 20:49:09.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:09 compute-0 nova_compute[186588]: 2026-02-26 20:49:09.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.088 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.088 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.088 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.089 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.208 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.209 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5989MB free_disk=72.77762985229492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.210 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.210 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.277 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.277 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.299 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.313 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.314 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:49:10 compute-0 nova_compute[186588]: 2026-02-26 20:49:10.315 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:49:11 compute-0 podman[217311]: 2026-02-26 20:49:11.539750764 +0000 UTC m=+0.049668023 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 26 20:49:11 compute-0 podman[217312]: 2026-02-26 20:49:11.540120414 +0000 UTC m=+0.048448901 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, tcib_managed=true)
Feb 26 20:49:11 compute-0 podman[217310]: 2026-02-26 20:49:11.558584952 +0000 UTC m=+0.067621567 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:49:12 compute-0 nova_compute[186588]: 2026-02-26 20:49:12.314 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:13 compute-0 nova_compute[186588]: 2026-02-26 20:49:13.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:49:13 compute-0 nova_compute[186588]: 2026-02-26 20:49:13.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:49:18 compute-0 podman[217373]: 2026-02-26 20:49:18.569177403 +0000 UTC m=+0.081464171 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 26 20:49:25 compute-0 podman[217399]: 2026-02-26 20:49:25.536464255 +0000 UTC m=+0.045336076 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:49:29 compute-0 podman[217423]: 2026-02-26 20:49:29.52407948 +0000 UTC m=+0.043515056 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Feb 26 20:49:29 compute-0 podman[202527]: time="2026-02-26T20:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:49:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:49:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Feb 26 20:49:31 compute-0 openstack_network_exporter[205682]: ERROR   20:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:49:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:49:31 compute-0 openstack_network_exporter[205682]: ERROR   20:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:49:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:49:42 compute-0 podman[217445]: 2026-02-26 20:49:42.544859033 +0000 UTC m=+0.059839517 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:49:42 compute-0 podman[217446]: 2026-02-26 20:49:42.570093646 +0000 UTC m=+0.074815636 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 26 20:49:42 compute-0 podman[217447]: 2026-02-26 20:49:42.594905978 +0000 UTC m=+0.098421186 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2)
Feb 26 20:49:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:49:46.520 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:49:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:49:46.520 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:49:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:49:46.520 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:49:47 compute-0 sshd-session[217508]: Connection closed by 138.197.173.229 port 55448
Feb 26 20:49:49 compute-0 podman[217509]: 2026-02-26 20:49:49.583419157 +0000 UTC m=+0.100781640 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:49:56 compute-0 podman[217536]: 2026-02-26 20:49:56.567863885 +0000 UTC m=+0.075879765 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:49:59 compute-0 podman[202527]: time="2026-02-26T20:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:49:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:49:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3009 "" "Go-http-client/1.1"
Feb 26 20:50:00 compute-0 podman[217561]: 2026-02-26 20:50:00.526594851 +0000 UTC m=+0.044959100 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 26 20:50:01 compute-0 openstack_network_exporter[205682]: ERROR   20:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:50:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:50:01 compute-0 openstack_network_exporter[205682]: ERROR   20:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:50:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:50:04 compute-0 nova_compute[186588]: 2026-02-26 20:50:04.056 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:08 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:08.003 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:c2:31', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:84:98:ae:7a:1c'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:08 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:08.004 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 20:50:08 compute-0 nova_compute[186588]: 2026-02-26 20:50:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:08 compute-0 nova_compute[186588]: 2026-02-26 20:50:08.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:50:08 compute-0 nova_compute[186588]: 2026-02-26 20:50:08.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:50:08 compute-0 nova_compute[186588]: 2026-02-26 20:50:08.075 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:50:08 compute-0 nova_compute[186588]: 2026-02-26 20:50:08.075 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:09 compute-0 nova_compute[186588]: 2026-02-26 20:50:09.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:09 compute-0 nova_compute[186588]: 2026-02-26 20:50:09.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:10 compute-0 nova_compute[186588]: 2026-02-26 20:50:10.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:10 compute-0 nova_compute[186588]: 2026-02-26 20:50:10.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:10 compute-0 sshd-session[217582]: Connection closed by authenticating user root 138.197.173.229 port 58500 [preauth]
Feb 26 20:50:11 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:11.006 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.088 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.089 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.089 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.089 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.205 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.205 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6001MB free_disk=72.77762985229492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.206 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.206 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.304 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.305 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.347 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.361 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.362 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:50:12 compute-0 nova_compute[186588]: 2026-02-26 20:50:12.362 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:13 compute-0 podman[217586]: 2026-02-26 20:50:13.575673631 +0000 UTC m=+0.072009292 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4)
Feb 26 20:50:13 compute-0 podman[217585]: 2026-02-26 20:50:13.598723196 +0000 UTC m=+0.095071317 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:50:13 compute-0 podman[217584]: 2026-02-26 20:50:13.598901761 +0000 UTC m=+0.099217299 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:50:15 compute-0 nova_compute[186588]: 2026-02-26 20:50:15.361 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:50:15 compute-0 nova_compute[186588]: 2026-02-26 20:50:15.362 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:50:20 compute-0 podman[217646]: 2026-02-26 20:50:20.592578796 +0000 UTC m=+0.105515695 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.193 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.194 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.238 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.354 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.356 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.364 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.365 186592 INFO nova.compute.claims [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.482 186592 DEBUG nova.compute.provider_tree [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.498 186592 DEBUG nova.scheduler.client.report [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.522 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.523 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:50:27 compute-0 podman[217673]: 2026-02-26 20:50:27.554820973 +0000 UTC m=+0.066414022 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.572 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.572 186592 DEBUG nova.network.neutron [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.593 186592 INFO nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.616 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.703 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.705 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.705 186592 INFO nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Creating image(s)
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.706 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "/var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.706 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "/var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.706 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "/var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.707 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:27 compute-0 nova_compute[186588]: 2026-02-26 20:50:27.707 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:28 compute-0 nova_compute[186588]: 2026-02-26 20:50:28.408 186592 WARNING oslo_policy.policy [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 26 20:50:28 compute-0 nova_compute[186588]: 2026-02-26 20:50:28.408 186592 WARNING oslo_policy.policy [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 26 20:50:28 compute-0 nova_compute[186588]: 2026-02-26 20:50:28.411 186592 DEBUG nova.policy [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'abebe541add240948f705a0b2859615f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '779290b5e1b1404b9197ae3c548b298e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:50:29 compute-0 nova_compute[186588]: 2026-02-26 20:50:29.649 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:29 compute-0 nova_compute[186588]: 2026-02-26 20:50:29.697 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.part --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:29 compute-0 nova_compute[186588]: 2026-02-26 20:50:29.699 186592 DEBUG nova.virt.images [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] b79c8674-3f8a-4529-8bd8-8464687ab831 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 26 20:50:29 compute-0 nova_compute[186588]: 2026-02-26 20:50:29.710 186592 DEBUG nova.privsep.utils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 26 20:50:29 compute-0 nova_compute[186588]: 2026-02-26 20:50:29.710 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.part /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:29 compute-0 podman[202527]: time="2026-02-26T20:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:50:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:50:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.199 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.part /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.converted" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.202 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.243 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a.converted --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.244 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.257 186592 INFO oslo.privsep.daemon [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp7sq60dma/privsep.sock']
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.489 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.490 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.514 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.607 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.607 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.614 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.614 186592 INFO nova.compute.claims [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.748 186592 DEBUG nova.compute.provider_tree [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.846 186592 INFO oslo.privsep.daemon [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Spawned new privsep daemon via rootwrap
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.739 217717 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.742 217717 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.744 217717 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.744 217717 INFO oslo.privsep.daemon [-] privsep daemon running as pid 217717
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.897 186592 ERROR nova.scheduler.client.report [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [req-60bda578-a34e-4c50-a2cd-b12edd201b3f] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 895ba9a7-707f-4e79-9130-ec9b9afa47ee.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-60bda578-a34e-4c50-a2cd-b12edd201b3f"}]}
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.911 186592 DEBUG nova.scheduler.client.report [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Refreshing inventories for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.917 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.926 186592 DEBUG nova.scheduler.client.report [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updating ProviderTree inventory for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.926 186592 DEBUG nova.compute.provider_tree [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.947 186592 DEBUG nova.scheduler.client.report [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Refreshing aggregate associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.957 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.957 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.958 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.968 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:30 compute-0 nova_compute[186588]: 2026-02-26 20:50:30.978 186592 DEBUG nova.scheduler.client.report [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Refreshing trait associations for resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.010 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.011 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.024 186592 DEBUG nova.compute.provider_tree [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.043 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.043 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.044 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.090 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.090 186592 DEBUG nova.virt.disk.api [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Checking if we can resize image /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.090 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.100 186592 DEBUG nova.scheduler.client.report [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updated inventory for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.101 186592 DEBUG nova.compute.provider_tree [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updating resource provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.101 186592 DEBUG nova.compute.provider_tree [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Updating inventory in ProviderTree for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.127 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.128 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.132 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.132 186592 DEBUG nova.virt.disk.api [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Cannot resize image /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.132 186592 DEBUG nova.objects.instance [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lazy-loading 'migration_context' on Instance uuid b3fa6df3-0cc8-44f5-b1fd-b96469990594 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.195 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.195 186592 DEBUG nova.network.neutron [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.230 186592 INFO nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.232 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.233 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Ensure instance console log exists: /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.233 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.233 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.234 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.264 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.271 186592 DEBUG nova.network.neutron [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Successfully created port: 1e16d98a-902e-4ff9-ba99-475b6eeba3de _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:50:31 compute-0 openstack_network_exporter[205682]: ERROR   20:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:50:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:50:31 compute-0 openstack_network_exporter[205682]: ERROR   20:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:50:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:50:31 compute-0 podman[217734]: 2026-02-26 20:50:31.556659699 +0000 UTC m=+0.052233525 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter)
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.572 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.573 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.574 186592 INFO nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Creating image(s)
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.575 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "/var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.575 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "/var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.576 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "/var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.592 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.654 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.655 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.656 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.665 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.705 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.706 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.729 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.730 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.731 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.780 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.781 186592 DEBUG nova.virt.disk.api [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Checking if we can resize image /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.781 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.821 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.822 186592 DEBUG nova.virt.disk.api [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Cannot resize image /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.822 186592 DEBUG nova.objects.instance [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lazy-loading 'migration_context' on Instance uuid 1164f692-eae8-4d3b-8453-9843d5ae0619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.837 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.837 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Ensure instance console log exists: /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.837 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.838 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.838 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:31 compute-0 nova_compute[186588]: 2026-02-26 20:50:31.896 186592 DEBUG nova.policy [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f7e7830e29f34940834b0dc390272550', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa50a5195a5249a1aed0159b8d734e3e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:50:33 compute-0 nova_compute[186588]: 2026-02-26 20:50:33.302 186592 DEBUG nova.network.neutron [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Successfully created port: f4925885-9f0a-48b5-be05-d81d7ba1d6e0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:50:33 compute-0 nova_compute[186588]: 2026-02-26 20:50:33.559 186592 DEBUG nova.network.neutron [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Successfully updated port: 1e16d98a-902e-4ff9-ba99-475b6eeba3de _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:50:33 compute-0 nova_compute[186588]: 2026-02-26 20:50:33.574 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:33 compute-0 nova_compute[186588]: 2026-02-26 20:50:33.575 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquired lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:33 compute-0 nova_compute[186588]: 2026-02-26 20:50:33.575 186592 DEBUG nova.network.neutron [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:50:33 compute-0 nova_compute[186588]: 2026-02-26 20:50:33.878 186592 DEBUG nova.network.neutron [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.523 186592 DEBUG nova.network.neutron [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Successfully updated port: f4925885-9f0a-48b5-be05-d81d7ba1d6e0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.540 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "refresh_cache-1164f692-eae8-4d3b-8453-9843d5ae0619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.541 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquired lock "refresh_cache-1164f692-eae8-4d3b-8453-9843d5ae0619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.541 186592 DEBUG nova.network.neutron [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.560 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.561 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.605 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.669 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.669 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.780 186592 DEBUG nova.network.neutron [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.894 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:50:34 compute-0 nova_compute[186588]: 2026-02-26 20:50:34.895 186592 INFO nova.compute.claims [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.076 186592 DEBUG nova.compute.provider_tree [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.103 186592 DEBUG nova.scheduler.client.report [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.130 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.460s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.131 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.134 186592 DEBUG nova.compute.manager [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-changed-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.134 186592 DEBUG nova.compute.manager [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Refreshing instance network info cache due to event network-changed-1e16d98a-902e-4ff9-ba99-475b6eeba3de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.135 186592 DEBUG oslo_concurrency.lockutils [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.186 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.186 186592 DEBUG nova.network.neutron [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.208 186592 INFO nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.224 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.300 186592 DEBUG nova.network.neutron [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Updating instance_info_cache with network_info: [{"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.335 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Releasing lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.336 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Instance network_info: |[{"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.336 186592 DEBUG oslo_concurrency.lockutils [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.337 186592 DEBUG nova.network.neutron [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Refreshing network info cache for port 1e16d98a-902e-4ff9-ba99-475b6eeba3de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.339 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Start _get_guest_xml network_info=[{"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.341 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.343 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.343 186592 INFO nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Creating image(s)
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.344 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.344 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.344 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.359 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.381 186592 WARNING nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.389 186592 DEBUG nova.virt.libvirt.host [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.390 186592 DEBUG nova.virt.libvirt.host [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.395 186592 DEBUG nova.virt.libvirt.host [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.395 186592 DEBUG nova.virt.libvirt.host [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.396 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.396 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.396 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.397 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.397 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.397 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.397 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.397 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.398 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.398 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.398 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.398 186592 DEBUG nova.virt.hardware [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.402 186592 DEBUG nova.privsep.utils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.403 186592 DEBUG nova.virt.libvirt.vif [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-314494616',display_name='tempest-ServersTestManualDisk-server-314494616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-314494616',id=1,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwuZxPMuil7IZ7EvI9UXcF17brGEKItpyjEi6I8YUi0iy7BqrIszwhh1zSUkPSNnRpRKMcui1WDOak9RHzm7x3t3YfIgHDz4rlxymBHHZnEaV5eU+i0/UUKWlvFXG8mig==',key_name='tempest-keypair-632283486',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='779290b5e1b1404b9197ae3c548b298e',ramdisk_id='',reservation_id='r-4jfdqrgx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1930113032',owner_user_name='tempest-ServersTestManualDisk-1930113032-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='abebe541add240948f705a0b2859615f',uuid=b3fa6df3-0cc8-44f5-b1fd-b96469990594,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.403 186592 DEBUG nova.network.os_vif_util [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Converting VIF {"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.404 186592 DEBUG nova.network.os_vif_util [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.406 186592 DEBUG nova.objects.instance [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lazy-loading 'pci_devices' on Instance uuid b3fa6df3-0cc8-44f5-b1fd-b96469990594 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.429 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <uuid>b3fa6df3-0cc8-44f5-b1fd-b96469990594</uuid>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <name>instance-00000001</name>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:name>tempest-ServersTestManualDisk-server-314494616</nova:name>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:50:35</nova:creationTime>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:user uuid="abebe541add240948f705a0b2859615f">tempest-ServersTestManualDisk-1930113032-project-member</nova:user>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:project uuid="779290b5e1b1404b9197ae3c548b298e">tempest-ServersTestManualDisk-1930113032</nova:project>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         <nova:port uuid="1e16d98a-902e-4ff9-ba99-475b6eeba3de">
Feb 26 20:50:35 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <system>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <entry name="serial">b3fa6df3-0cc8-44f5-b1fd-b96469990594</entry>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <entry name="uuid">b3fa6df3-0cc8-44f5-b1fd-b96469990594</entry>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </system>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <os>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </os>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <features>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </features>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.config"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:3f:72:b9"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <target dev="tap1e16d98a-90"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/console.log" append="off"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <video>
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </video>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:50:35 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:50:35 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:50:35 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:50:35 compute-0 nova_compute[186588]: </domain>
Feb 26 20:50:35 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.430 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Preparing to wait for external event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.430 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.431 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.431 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.431 186592 DEBUG nova.virt.libvirt.vif [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-314494616',display_name='tempest-ServersTestManualDisk-server-314494616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-314494616',id=1,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwuZxPMuil7IZ7EvI9UXcF17brGEKItpyjEi6I8YUi0iy7BqrIszwhh1zSUkPSNnRpRKMcui1WDOak9RHzm7x3t3YfIgHDz4rlxymBHHZnEaV5eU+i0/UUKWlvFXG8mig==',key_name='tempest-keypair-632283486',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='779290b5e1b1404b9197ae3c548b298e',ramdisk_id='',reservation_id='r-4jfdqrgx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1930113032',owner_user_name='tempest-ServersTestManualDisk-1930113032-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='abebe541add240948f705a0b2859615f',uuid=b3fa6df3-0cc8-44f5-b1fd-b96469990594,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.432 186592 DEBUG nova.network.os_vif_util [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Converting VIF {"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.432 186592 DEBUG nova.network.os_vif_util [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.432 186592 DEBUG os_vif [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.457 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.458 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.458 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.468 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.488 186592 DEBUG ovsdbapp.backend.ovs_idl [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.488 186592 DEBUG ovsdbapp.backend.ovs_idl [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.488 186592 DEBUG ovsdbapp.backend.ovs_idl [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.489 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.490 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [POLLOUT] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.490 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.490 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.491 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.493 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.501 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.501 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.501 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.502 186592 INFO oslo.privsep.daemon [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpw5oqhaon/privsep.sock']
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.543 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.544 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.560 186592 DEBUG nova.policy [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '683dc1563e22496ba81bf3253756023f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93f63acb614a4c41813a655e2176374f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.574 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.574 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.574 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.619 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.620 186592 DEBUG nova.virt.disk.api [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Checking if we can resize image /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.620 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.664 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.664 186592 DEBUG nova.virt.disk.api [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Cannot resize image /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.665 186592 DEBUG nova.objects.instance [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'migration_context' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.686 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.686 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Ensure instance console log exists: /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.687 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.687 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:35 compute-0 nova_compute[186588]: 2026-02-26 20:50:35.687 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.203 186592 INFO oslo.privsep.daemon [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Spawned new privsep daemon via rootwrap
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.104 217789 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.111 217789 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.115 217789 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.115 217789 INFO oslo.privsep.daemon [-] privsep daemon running as pid 217789
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.346 186592 DEBUG nova.network.neutron [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Updating instance_info_cache with network_info: [{"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.370 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Releasing lock "refresh_cache-1164f692-eae8-4d3b-8453-9843d5ae0619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.371 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Instance network_info: |[{"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.373 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Start _get_guest_xml network_info=[{"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.379 186592 WARNING nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.383 186592 DEBUG nova.virt.libvirt.host [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.384 186592 DEBUG nova.virt.libvirt.host [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.388 186592 DEBUG nova.virt.libvirt.host [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.388 186592 DEBUG nova.virt.libvirt.host [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.389 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.390 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.390 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.391 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.391 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.391 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.392 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.392 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.392 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.392 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.393 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.393 186592 DEBUG nova.virt.hardware [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.396 186592 DEBUG nova.virt.libvirt.vif [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1796184787',display_name='tempest-ServerAddressesTestJSON-server-1796184787',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1796184787',id=2,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa50a5195a5249a1aed0159b8d734e3e',ramdisk_id='',reservation_id='r-3fo1m4fg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1304881296',owner_user_name='tempest-ServerAddresses
TestJSON-1304881296-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:31Z,user_data=None,user_id='f7e7830e29f34940834b0dc390272550',uuid=1164f692-eae8-4d3b-8453-9843d5ae0619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.397 186592 DEBUG nova.network.os_vif_util [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Converting VIF {"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.397 186592 DEBUG nova.network.os_vif_util [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.398 186592 DEBUG nova.objects.instance [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lazy-loading 'pci_devices' on Instance uuid 1164f692-eae8-4d3b-8453-9843d5ae0619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.413 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <uuid>1164f692-eae8-4d3b-8453-9843d5ae0619</uuid>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <name>instance-00000002</name>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:name>tempest-ServerAddressesTestJSON-server-1796184787</nova:name>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:50:36</nova:creationTime>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:user uuid="f7e7830e29f34940834b0dc390272550">tempest-ServerAddressesTestJSON-1304881296-project-member</nova:user>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:project uuid="fa50a5195a5249a1aed0159b8d734e3e">tempest-ServerAddressesTestJSON-1304881296</nova:project>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         <nova:port uuid="f4925885-9f0a-48b5-be05-d81d7ba1d6e0">
Feb 26 20:50:36 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <system>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <entry name="serial">1164f692-eae8-4d3b-8453-9843d5ae0619</entry>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <entry name="uuid">1164f692-eae8-4d3b-8453-9843d5ae0619</entry>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </system>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <os>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </os>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <features>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </features>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.config"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:fd:fc:f7"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <target dev="tapf4925885-9f"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/console.log" append="off"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <video>
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </video>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:50:36 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:50:36 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:50:36 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:50:36 compute-0 nova_compute[186588]: </domain>
Feb 26 20:50:36 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.415 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Preparing to wait for external event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.416 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.416 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.416 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.418 186592 DEBUG nova.virt.libvirt.vif [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1796184787',display_name='tempest-ServerAddressesTestJSON-server-1796184787',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1796184787',id=2,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fa50a5195a5249a1aed0159b8d734e3e',ramdisk_id='',reservation_id='r-3fo1m4fg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1304881296',owner_user_name='tempest-ServerAddressesTestJSON-1304881296-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:31Z,user_data=None,user_id='f7e7830e29f34940834b0dc390272550',uuid=1164f692-eae8-4d3b-8453-9843d5ae0619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.418 186592 DEBUG nova.network.os_vif_util [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Converting VIF {"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.419 186592 DEBUG nova.network.os_vif_util [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.420 186592 DEBUG os_vif [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.421 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.422 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.422 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.547 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.548 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e16d98a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.549 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e16d98a-90, col_values=(('external_ids', {'iface-id': '1e16d98a-902e-4ff9-ba99-475b6eeba3de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:72:b9', 'vm-uuid': 'b3fa6df3-0cc8-44f5-b1fd-b96469990594'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.550 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 NetworkManager[56360]: <info>  [1772139036.5526] manager: (tap1e16d98a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.553 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.559 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.560 186592 INFO os_vif [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90')
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.561 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.562 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4925885-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.562 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4925885-9f, col_values=(('external_ids', {'iface-id': 'f4925885-9f0a-48b5-be05-d81d7ba1d6e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:fc:f7', 'vm-uuid': '1164f692-eae8-4d3b-8453-9843d5ae0619'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.563 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 NetworkManager[56360]: <info>  [1772139036.5641] manager: (tapf4925885-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.565 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.570 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.570 186592 INFO os_vif [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f')
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.621 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.622 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.622 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] No VIF found with MAC fa:16:3e:fd:fc:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.622 186592 INFO nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Using config drive
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.624 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.625 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.625 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] No VIF found with MAC fa:16:3e:3f:72:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:50:36 compute-0 nova_compute[186588]: 2026-02-26 20:50:36.625 186592 INFO nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Using config drive
Feb 26 20:50:37 compute-0 nova_compute[186588]: 2026-02-26 20:50:37.207 186592 DEBUG nova.network.neutron [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Successfully created port: 83133bd7-0bf0-46a6-9cda-315762a021e8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:50:37 compute-0 nova_compute[186588]: 2026-02-26 20:50:37.880 186592 INFO nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Creating config drive at /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.config
Feb 26 20:50:37 compute-0 nova_compute[186588]: 2026-02-26 20:50:37.885 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp9k4alh08 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:37 compute-0 nova_compute[186588]: 2026-02-26 20:50:37.923 186592 DEBUG nova.network.neutron [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Updated VIF entry in instance network info cache for port 1e16d98a-902e-4ff9-ba99-475b6eeba3de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:37 compute-0 nova_compute[186588]: 2026-02-26 20:50:37.924 186592 DEBUG nova.network.neutron [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Updating instance_info_cache with network_info: [{"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:37 compute-0 nova_compute[186588]: 2026-02-26 20:50:37.942 186592 DEBUG oslo_concurrency.lockutils [req-22eede99-1d2b-4928-adbb-fa4d6f0a0319 req-94b34431-14f1-445c-8a83-1492946ad1d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.006 186592 DEBUG oslo_concurrency.processutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp9k4alh08" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:38 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 26 20:50:38 compute-0 kernel: tapf4925885-9f: entered promiscuous mode
Feb 26 20:50:38 compute-0 NetworkManager[56360]: <info>  [1772139038.0690] manager: (tapf4925885-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00027|binding|INFO|Claiming lport f4925885-9f0a-48b5-be05-d81d7ba1d6e0 for this chassis.
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.073 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00028|binding|INFO|f4925885-9f0a-48b5-be05-d81d7ba1d6e0: Claiming fa:16:3e:fd:fc:f7 10.100.0.9
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.080 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 systemd-udevd[217820]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:50:38 compute-0 NetworkManager[56360]: <info>  [1772139038.1131] device (tapf4925885-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.100 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:fc:f7 10.100.0.9'], port_security=['fa:16:3e:fd:fc:f7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1164f692-eae8-4d3b-8453-9843d5ae0619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa50a5195a5249a1aed0159b8d734e3e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '90ceddb2-db48-44af-b300-f6f6275d4d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9006ad7a-16b4-476d-9d76-258ea58e2c0f, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=f4925885-9f0a-48b5-be05-d81d7ba1d6e0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.102 105929 INFO neutron.agent.ovn.metadata.agent [-] Port f4925885-9f0a-48b5-be05-d81d7ba1d6e0 in datapath 37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 bound to our chassis
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.104 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.105 105929 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpt0tpft0o/privsep.sock']
Feb 26 20:50:38 compute-0 NetworkManager[56360]: <info>  [1772139038.1138] device (tapf4925885-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.115 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00029|binding|INFO|Setting lport f4925885-9f0a-48b5-be05-d81d7ba1d6e0 ovn-installed in OVS
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00030|binding|INFO|Setting lport f4925885-9f0a-48b5-be05-d81d7ba1d6e0 up in Southbound
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.122 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 systemd-machined[155924]: New machine qemu-1-instance-00000002.
Feb 26 20:50:38 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.191 186592 INFO nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Creating config drive at /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.config
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.196 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpmsyj1jen execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.315 186592 DEBUG oslo_concurrency.processutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpmsyj1jen" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:38 compute-0 kernel: tap1e16d98a-90: entered promiscuous mode
Feb 26 20:50:38 compute-0 NetworkManager[56360]: <info>  [1772139038.3640] manager: (tap1e16d98a-90): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Feb 26 20:50:38 compute-0 systemd-udevd[217823]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00031|binding|INFO|Claiming lport 1e16d98a-902e-4ff9-ba99-475b6eeba3de for this chassis.
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00032|binding|INFO|1e16d98a-902e-4ff9-ba99-475b6eeba3de: Claiming fa:16:3e:3f:72:b9 10.100.0.4
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.373 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.378 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 NetworkManager[56360]: <info>  [1772139038.3828] device (tap1e16d98a-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:50:38 compute-0 NetworkManager[56360]: <info>  [1772139038.3834] device (tap1e16d98a-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:50:38 compute-0 systemd-machined[155924]: New machine qemu-2-instance-00000001.
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00033|binding|INFO|Setting lport 1e16d98a-902e-4ff9-ba99-475b6eeba3de ovn-installed in OVS
Feb 26 20:50:38 compute-0 ovn_controller[96598]: 2026-02-26T20:50:38Z|00034|binding|INFO|Setting lport 1e16d98a-902e-4ff9-ba99-475b6eeba3de up in Southbound
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.394 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:72:b9 10.100.0.4'], port_security=['fa:16:3e:3f:72:b9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b3fa6df3-0cc8-44f5-b1fd-b96469990594', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '779290b5e1b1404b9197ae3c548b298e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '748ec538-28fd-46c5-ba4d-88f67eb179f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e0c71a4-541f-451c-b68c-f5e9d0b98930, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=1e16d98a-902e-4ff9-ba99-475b6eeba3de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.395 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:38 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.481 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139038.4809709, 1164f692-eae8-4d3b-8453-9843d5ae0619 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.481 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] VM Started (Lifecycle Event)
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.539 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.542 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139038.4834907, 1164f692-eae8-4d3b-8453-9843d5ae0619 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.543 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] VM Paused (Lifecycle Event)
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.565 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.569 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.589 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.672 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.672 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.688 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.763 186592 DEBUG nova.compute.manager [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-changed-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.763 186592 DEBUG nova.compute.manager [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Refreshing instance network info cache due to event network-changed-f4925885-9f0a-48b5-be05-d81d7ba1d6e0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.763 186592 DEBUG oslo_concurrency.lockutils [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-1164f692-eae8-4d3b-8453-9843d5ae0619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.763 186592 DEBUG oslo_concurrency.lockutils [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-1164f692-eae8-4d3b-8453-9843d5ae0619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.763 186592 DEBUG nova.network.neutron [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Refreshing network info cache for port f4925885-9f0a-48b5-be05-d81d7ba1d6e0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.765 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139038.7600515, b3fa6df3-0cc8-44f5-b1fd-b96469990594 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.765 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] VM Started (Lifecycle Event)
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.781 105929 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.782 105929 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpt0tpft0o/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.627 217873 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.631 217873 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.633 217873 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.633 217873 INFO oslo.privsep.daemon [-] privsep daemon running as pid 217873
Feb 26 20:50:38 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:38.784 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ee47f92d-92be-4984-818c-d592b0f92325]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.789 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.790 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.790 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.796 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.797 186592 INFO nova.compute.claims [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.801 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139038.7647524, b3fa6df3-0cc8-44f5-b1fd-b96469990594 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.801 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] VM Paused (Lifecycle Event)
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.839 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.844 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:38 compute-0 nova_compute[186588]: 2026-02-26 20:50:38.865 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.008 186592 DEBUG nova.compute.provider_tree [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.023 186592 DEBUG nova.scheduler.client.report [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.048 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.048 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.090 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.091 186592 DEBUG nova.network.neutron [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.107 186592 INFO nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.132 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.218 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.220 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.220 186592 INFO nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Creating image(s)
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.220 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "/var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.221 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "/var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.221 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "/var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.234 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.269 217873 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.269 217873 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.269 217873 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.279 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.280 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.280 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.291 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.340 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.341 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.366 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.367 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.367 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.413 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.414 186592 DEBUG nova.virt.disk.api [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Checking if we can resize image /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.414 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.467 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.468 186592 DEBUG nova.virt.disk.api [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Cannot resize image /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.468 186592 DEBUG nova.objects.instance [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lazy-loading 'migration_context' on Instance uuid c6227533-c229-4c5d-8090-798e386966a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.472 186592 DEBUG nova.policy [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d0c2e48334a4be2bd9254c186744540', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2787e3da42384259a63c344570077339', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.481 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.481 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Ensure instance console log exists: /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.481 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.482 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.482 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.577 186592 DEBUG nova.network.neutron [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Successfully updated port: 83133bd7-0bf0-46a6-9cda-315762a021e8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.593 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.593 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquired lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.593 186592 DEBUG nova.network.neutron [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:50:39 compute-0 nova_compute[186588]: 2026-02-26 20:50:39.626 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.754 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[0ee608c9-c6c0-42c6-82bf-4e2b13b9b080]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.755 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap37f38bfa-b1 in ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.756 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap37f38bfa-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.756 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[085ccd4d-55b8-4d27-a12d-267e7469c9c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.759 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[10bbc3de-0cb2-4fff-9169-2bbe53653a58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.778 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[083e57c6-a46c-4f1a-9dbe-b192cf188187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.798 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ee47f206-d802-4fe5-b129-c6f412194c26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:39.800 105929 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpun1bo4ii/privsep.sock']
Feb 26 20:50:40 compute-0 nova_compute[186588]: 2026-02-26 20:50:40.035 186592 DEBUG nova.network.neutron [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.434 105929 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.435 105929 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpun1bo4ii/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.313 217909 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.318 217909 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.320 217909 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.320 217909 INFO oslo.privsep.daemon [-] privsep daemon running as pid 217909
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.438 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[1efca332-52b5-477c-af9b-d5ad1f736569]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.871 217909 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.871 217909 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:40.872 217909 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:40 compute-0 nova_compute[186588]: 2026-02-26 20:50:40.983 186592 DEBUG nova.network.neutron [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Successfully created port: 68e4f67f-e825-4d68-a244-3a15f7c7b5fc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.401 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[63ad8af2-8d6a-4d9e-904c-045407c38294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.420 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[7379794a-02b4-44bb-9d92-060a5c2cbc9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 NetworkManager[56360]: <info>  [1772139041.4222] manager: (tap37f38bfa-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.438 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[ef25ac87-d78d-4d90-8891-b091c383850f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.440 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[dc62d8e8-e2b6-4e6b-85a8-d82332736d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 systemd-udevd[217921]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:50:41 compute-0 NetworkManager[56360]: <info>  [1772139041.4544] device (tap37f38bfa-b0): carrier: link connected
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.458 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6c5397-384b-4c09-8580-f87fdca93e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.470 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[9de2141d-9920-4ab5-bdcb-f3904fb00121]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37f38bfa-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:29:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362801, 'reachable_time': 34668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217939, 'error': None, 'target': 'ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.482 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b16df461-c826-4b16-a199-bbc4480d6610]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:291b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362801, 'tstamp': 362801}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217940, 'error': None, 'target': 'ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.495 186592 DEBUG nova.network.neutron [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.495 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e9fa977f-2165-4afd-8331-76719214b06f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37f38bfa-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:29:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362801, 'reachable_time': 34668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217941, 'error': None, 'target': 'ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.507 186592 DEBUG nova.network.neutron [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Updated VIF entry in instance network info cache for port f4925885-9f0a-48b5-be05-d81d7ba1d6e0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.508 186592 DEBUG nova.network.neutron [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Updating instance_info_cache with network_info: [{"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.512 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Releasing lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.512 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance network_info: |[{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.515 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Start _get_guest_xml network_info=[{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.519 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e79f133d-dad4-41c6-b332-f29a6e14ee15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.521 186592 WARNING nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.526 186592 DEBUG nova.virt.libvirt.host [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.527 186592 DEBUG nova.virt.libvirt.host [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.529 186592 DEBUG oslo_concurrency.lockutils [req-5820a0d4-01cf-417e-a74d-46c4278fd464 req-6f341197-92ca-4719-acfd-afd07b1283ce d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-1164f692-eae8-4d3b-8453-9843d5ae0619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.531 186592 DEBUG nova.virt.libvirt.host [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.532 186592 DEBUG nova.virt.libvirt.host [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.532 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.533 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.533 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.533 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.534 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.534 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.534 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.534 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.535 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.535 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.535 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.536 186592 DEBUG nova.virt.hardware [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.539 186592 DEBUG nova.virt.libvirt.vif [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-789364433',display_name='tempest-ServerActionsTestJSON-server-789364433',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-789364433',id=3,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKv4jZf/gJ2i41HbUx/UjYlMvLbOCl3KavS3raWK/kJbvOt949QnmXz4hRwBuj0ze7kGjLYbQ3QIBJLoNUIWmSkp5hXwN3v7JqVnHHG54WXxS3hNZgMcy8Kc47SEFtrOtQ==',key_name='tempest-keypair-883907450',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93f63acb614a4c41813a655e2176374f',ramdisk_id='',reservation_id='r-57kan9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-377651542',owner_user_name='tempest-ServerActionsTestJSON-377651542-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='683dc1563e22496ba81bf3253756023f',uuid=db65189c-3257-4f7c-8407-d99446ead27c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.539 186592 DEBUG nova.network.os_vif_util [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converting VIF {"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.540 186592 DEBUG nova.network.os_vif_util [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.541 186592 DEBUG nova.objects.instance [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'pci_devices' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.552 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <uuid>db65189c-3257-4f7c-8407-d99446ead27c</uuid>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <name>instance-00000003</name>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:name>tempest-ServerActionsTestJSON-server-789364433</nova:name>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:50:41</nova:creationTime>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:user uuid="683dc1563e22496ba81bf3253756023f">tempest-ServerActionsTestJSON-377651542-project-member</nova:user>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:project uuid="93f63acb614a4c41813a655e2176374f">tempest-ServerActionsTestJSON-377651542</nova:project>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         <nova:port uuid="83133bd7-0bf0-46a6-9cda-315762a021e8">
Feb 26 20:50:41 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <system>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <entry name="serial">db65189c-3257-4f7c-8407-d99446ead27c</entry>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <entry name="uuid">db65189c-3257-4f7c-8407-d99446ead27c</entry>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </system>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <os>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </os>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <features>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </features>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:77:0b:72"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <target dev="tap83133bd7-0b"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/console.log" append="off"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <video>
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </video>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:50:41 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:50:41 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:50:41 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:50:41 compute-0 nova_compute[186588]: </domain>
Feb 26 20:50:41 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.553 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Preparing to wait for external event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.553 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.553 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.553 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.554 186592 DEBUG nova.virt.libvirt.vif [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-789364433',display_name='tempest-ServerActionsTestJSON-server-789364433',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-789364433',id=3,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKv4jZf/gJ2i41HbUx/UjYlMvLbOCl3KavS3raWK/kJbvOt949QnmXz4hRwBuj0ze7kGjLYbQ3QIBJLoNUIWmSkp5hXwN3v7JqVnHHG54WXxS3hNZgMcy8Kc47SEFtrOtQ==',key_name='tempest-keypair-883907450',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93f63acb614a4c41813a655e2176374f',ramdisk_id='',reservation_id='r-57kan9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-377651542',owner_user_name='tempest-ServerActionsTestJSON-377651542-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='683dc1563e22496ba81bf3253756023f',uuid=db65189c-3257-4f7c-8407-d99446ead27c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.554 186592 DEBUG nova.network.os_vif_util [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converting VIF {"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.555 186592 DEBUG nova.network.os_vif_util [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.555 186592 DEBUG os_vif [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.555 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e20696bc-6cb0-492a-a239-7f86a93087e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.555 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.556 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.556 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.556 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37f38bfa-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.557 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.557 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37f38bfa-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:41 compute-0 NetworkManager[56360]: <info>  [1772139041.5596] manager: (tap37f38bfa-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Feb 26 20:50:41 compute-0 kernel: tap37f38bfa-b0: entered promiscuous mode
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.558 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.562 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37f38bfa-b0, col_values=(('external_ids', {'iface-id': '5e0ab0a4-420c-4069-8eba-6771e2b585be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:41 compute-0 ovn_controller[96598]: 2026-02-26T20:50:41Z|00035|binding|INFO|Releasing lport 5e0ab0a4-420c-4069-8eba-6771e2b585be from this chassis (sb_readonly=0)
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.563 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.564 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.564 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.564 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd5c8eb-227e-4e7f-80b4-2584e63ba968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.565 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83133bd7-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.565 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50.pid.haproxy
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.565 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap83133bd7-0b, col_values=(('external_ids', {'iface-id': '83133bd7-0bf0-46a6-9cda-315762a021e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:0b:72', 'vm-uuid': 'db65189c-3257-4f7c-8407-d99446ead27c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.566 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'env', 'PROCESS_TAG=haproxy-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.566 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:41 compute-0 NetworkManager[56360]: <info>  [1772139041.5673] manager: (tap83133bd7-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.568 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.569 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.570 186592 INFO os_vif [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b')
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.610 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.611 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.611 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] No VIF found with MAC fa:16:3e:77:0b:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.612 186592 INFO nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Using config drive
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.827 186592 DEBUG nova.compute.manager [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-changed-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.828 186592 DEBUG nova.compute.manager [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Refreshing instance network info cache due to event network-changed-83133bd7-0bf0-46a6-9cda-315762a021e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.828 186592 DEBUG oslo_concurrency.lockutils [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.828 186592 DEBUG oslo_concurrency.lockutils [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:41 compute-0 nova_compute[186588]: 2026-02-26 20:50:41.828 186592 DEBUG nova.network.neutron [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Refreshing network info cache for port 83133bd7-0bf0-46a6-9cda-315762a021e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:41 compute-0 podman[217976]: 2026-02-26 20:50:41.886701087 +0000 UTC m=+0.045899315 container create 0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:50:41 compute-0 systemd[1]: Started libpod-conmon-0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c.scope.
Feb 26 20:50:41 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:50:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4227ca177b5bcc57d4785de9aa5bd7e36bd752ef0880bebd8259cbb02428507b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:50:41 compute-0 podman[217976]: 2026-02-26 20:50:41.953066188 +0000 UTC m=+0.112264426 container init 0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:50:41 compute-0 podman[217976]: 2026-02-26 20:50:41.861780903 +0000 UTC m=+0.020979141 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:50:41 compute-0 podman[217976]: 2026-02-26 20:50:41.957773624 +0000 UTC m=+0.116971842 container start 0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:50:41 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [NOTICE]   (217997) : New worker (217999) forked
Feb 26 20:50:41 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [NOTICE]   (217997) : Loading success.
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.994 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 1e16d98a-902e-4ff9-ba99-475b6eeba3de in datapath 455d5ac8-4ae4-435d-a896-0f99b4c324cc unbound from our chassis
Feb 26 20:50:41 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:41.996 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455d5ac8-4ae4-435d-a896-0f99b4c324cc
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.001 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[0847005b-7332-4349-bdc2-6c58e9cafcf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.001 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455d5ac8-41 in ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.002 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455d5ac8-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.002 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5901d5-bf19-4ff2-b056-2553d60d1f7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.003 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[73569735-fc03-4ab6-953c-323c8b124492]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.015 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a1b298-e260-4b57-9e40-c4be53ff919e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.024 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[dc25094f-008a-46ed-a1c9-2141eec38b6c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.038 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[8efed7ac-d683-4b38-9eff-ee8f27377201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.041 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[a57e9e3f-9514-4b8e-947f-6747c0e86528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.0424] manager: (tap455d5ac8-40): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Feb 26 20:50:42 compute-0 systemd-udevd[217931]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.057 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[87d30856-77db-40b1-b235-e93c06960482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.060 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4575c9-3c5a-4295-9a1a-7b4d966f5c2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.0727] device (tap455d5ac8-40): carrier: link connected
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.075 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[e79ff158-6604-4f17-9760-5667df5ab511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.084 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd2268a-7419-42e6-9dab-90f136ba7893]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455d5ac8-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:73:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362863, 'reachable_time': 18520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218022, 'error': None, 'target': 'ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.094 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[0367f7fd-6b3c-4f6d-921c-10fb6cdf571f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:73d4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362863, 'tstamp': 362863}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218023, 'error': None, 'target': 'ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.104 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[2953e5f6-cdf1-4264-bd01-fd9ed1b9e493]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455d5ac8-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:73:d4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362863, 'reachable_time': 18520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218024, 'error': None, 'target': 'ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.120 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb6f734-9d0d-402e-a4e9-83f104c10d63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.155 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[79c94325-4a06-490a-819d-ae1a944d7d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.156 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455d5ac8-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.156 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.157 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455d5ac8-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.162 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.1630] manager: (tap455d5ac8-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Feb 26 20:50:42 compute-0 kernel: tap455d5ac8-40: entered promiscuous mode
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.166 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.167 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455d5ac8-40, col_values=(('external_ids', {'iface-id': 'a2b5842f-50ae-46d2-bb42-c826042f9dfe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.168 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 ovn_controller[96598]: 2026-02-26T20:50:42Z|00036|binding|INFO|Releasing lport a2b5842f-50ae-46d2-bb42-c826042f9dfe from this chassis (sb_readonly=0)
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.173 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.175 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.175 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455d5ac8-4ae4-435d-a896-0f99b4c324cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455d5ac8-4ae4-435d-a896-0f99b4c324cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.176 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[884ca216-8f81-4873-944e-8306f80d6807]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.176 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-455d5ac8-4ae4-435d-a896-0f99b4c324cc
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/455d5ac8-4ae4-435d-a896-0f99b4c324cc.pid.haproxy
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 455d5ac8-4ae4-435d-a896-0f99b4c324cc
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.177 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'env', 'PROCESS_TAG=haproxy-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455d5ac8-4ae4-435d-a896-0f99b4c324cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:50:42 compute-0 podman[218059]: 2026-02-26 20:50:42.461665987 +0000 UTC m=+0.043654856 container create efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_managed=true)
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.485 186592 INFO nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Creating config drive at /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.490 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmph4bq3ys9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:42 compute-0 systemd[1]: Started libpod-conmon-efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac.scope.
Feb 26 20:50:42 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:50:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94776e599f1aacd9fae1ed88daa85fc18d7891a8e5de5ac4a04b3b7886c01fcf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:50:42 compute-0 podman[218059]: 2026-02-26 20:50:42.519369286 +0000 UTC m=+0.101358175 container init efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:50:42 compute-0 podman[218059]: 2026-02-26 20:50:42.52401133 +0000 UTC m=+0.106000199 container start efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:50:42 compute-0 podman[218059]: 2026-02-26 20:50:42.440911073 +0000 UTC m=+0.022899972 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:50:42 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [NOTICE]   (218082) : New worker (218084) forked
Feb 26 20:50:42 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [NOTICE]   (218082) : Loading success.
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.607 186592 DEBUG oslo_concurrency.processutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmph4bq3ys9" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.6446] manager: (tap83133bd7-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Feb 26 20:50:42 compute-0 kernel: tap83133bd7-0b: entered promiscuous mode
Feb 26 20:50:42 compute-0 ovn_controller[96598]: 2026-02-26T20:50:42Z|00037|binding|INFO|Claiming lport 83133bd7-0bf0-46a6-9cda-315762a021e8 for this chassis.
Feb 26 20:50:42 compute-0 ovn_controller[96598]: 2026-02-26T20:50:42Z|00038|binding|INFO|83133bd7-0bf0-46a6-9cda-315762a021e8: Claiming fa:16:3e:77:0b:72 10.100.0.14
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.646 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.652 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.6551] device (tap83133bd7-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.6571] device (tap83133bd7-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.664 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:0b:72 10.100.0.14'], port_security=['fa:16:3e:77:0b:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'db65189c-3257-4f7c-8407-d99446ead27c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8912f988-fb86-4f9a-91d3-d98453103e4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93f63acb614a4c41813a655e2176374f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'af61bd30-342c-4238-9c48-29adad8f0e57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4782c29f-d92e-43fa-8dcd-4ddac552e07a, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=83133bd7-0bf0-46a6-9cda-315762a021e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.666 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 83133bd7-0bf0-46a6-9cda-315762a021e8 in datapath 8912f988-fb86-4f9a-91d3-d98453103e4e bound to our chassis
Feb 26 20:50:42 compute-0 ovn_controller[96598]: 2026-02-26T20:50:42Z|00039|binding|INFO|Setting lport 83133bd7-0bf0-46a6-9cda-315762a021e8 ovn-installed in OVS
Feb 26 20:50:42 compute-0 ovn_controller[96598]: 2026-02-26T20:50:42Z|00040|binding|INFO|Setting lport 83133bd7-0bf0-46a6-9cda-315762a021e8 up in Southbound
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.669 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8912f988-fb86-4f9a-91d3-d98453103e4e
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.669 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 systemd-machined[155924]: New machine qemu-3-instance-00000003.
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.680 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[9a61f13d-926c-4f3e-99a1-736e4eac23d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.681 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8912f988-f1 in ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.683 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8912f988-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.683 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab092ce-5274-4cba-b3fc-3ad7a6f90645]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.684 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9350c9-6900-4adc-963f-4643c136b352]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.698 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef6b534-8517-4c61-a899-856174ea3808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.717 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f3ee8c-b040-4b3e-8bc3-0c53cd63a70d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.733 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[d67ef46b-f4d9-455b-b489-7f24b855587d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.737 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[d82016a8-c8d9-4706-98a1-71145fc18bd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.7387] manager: (tap8912f988-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.754 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[e500a754-0e81-4e80-8c43-74efb442c53c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.756 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[e516804f-dfcb-4af2-aaca-2bbdcf252b8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.7685] device (tap8912f988-f0): carrier: link connected
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.770 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[760299d0-d810-4d11-b8e3-f9a20f24f362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.783 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[522a5ae1-e6d3-4304-b6fd-752f5e5473e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8912f988-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:0d:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362932, 'reachable_time': 18414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218125, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.793 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[651d0860-4661-470a-9fd6-b6c501850679]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:dc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362932, 'tstamp': 362932}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218127, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.806 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[1b69983e-5284-4697-93d1-6cdcc29900a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8912f988-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:0d:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362932, 'reachable_time': 18414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218129, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.826 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef3b302-dbd7-44b9-8a46-f674f2b68d97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.861 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd294a6-d3b2-4b65-bcce-63e67375126d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.862 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8912f988-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.862 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.863 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8912f988-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.864 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 NetworkManager[56360]: <info>  [1772139042.8651] manager: (tap8912f988-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 26 20:50:42 compute-0 kernel: tap8912f988-f0: entered promiscuous mode
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.866 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.868 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139042.867973, db65189c-3257-4f7c-8407-d99446ead27c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.868 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] VM Started (Lifecycle Event)
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.868 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8912f988-f0, col_values=(('external_ids', {'iface-id': 'feef4d0a-7ad6-4fc7-99f1-0f847997a8be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:42 compute-0 ovn_controller[96598]: 2026-02-26T20:50:42Z|00041|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.870 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.871 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8912f988-fb86-4f9a-91d3-d98453103e4e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8912f988-fb86-4f9a-91d3-d98453103e4e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.872 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e425e7-a8c6-45f2-94a8-90e5087c4515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.873 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-8912f988-fb86-4f9a-91d3-d98453103e4e
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/8912f988-fb86-4f9a-91d3-d98453103e4e.pid.haproxy
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 8912f988-fb86-4f9a-91d3-d98453103e4e
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:50:42 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:42.873 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'env', 'PROCESS_TAG=haproxy-8912f988-fb86-4f9a-91d3-d98453103e4e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8912f988-fb86-4f9a-91d3-d98453103e4e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.873 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.892 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.895 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139042.868087, db65189c-3257-4f7c-8407-d99446ead27c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.895 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] VM Paused (Lifecycle Event)
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.913 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.915 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:42 compute-0 nova_compute[186588]: 2026-02-26 20:50:42.932 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:43 compute-0 podman[218166]: 2026-02-26 20:50:43.157888611 +0000 UTC m=+0.043942252 container create 87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:50:43 compute-0 systemd[1]: Started libpod-conmon-87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e.scope.
Feb 26 20:50:43 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:50:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5af58596274cb205068458f18d8181bd7bc35cae1886cc24509987ce2192fc75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:50:43 compute-0 podman[218166]: 2026-02-26 20:50:43.216343751 +0000 UTC m=+0.102397392 container init 87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 26 20:50:43 compute-0 podman[218166]: 2026-02-26 20:50:43.221015736 +0000 UTC m=+0.107069377 container start 87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, tcib_managed=true, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 26 20:50:43 compute-0 podman[218166]: 2026-02-26 20:50:43.134048116 +0000 UTC m=+0.020101757 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:50:43 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [NOTICE]   (218185) : New worker (218187) forked
Feb 26 20:50:43 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [NOTICE]   (218185) : Loading success.
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.499 186592 DEBUG nova.network.neutron [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Successfully updated port: 68e4f67f-e825-4d68-a244-3a15f7c7b5fc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.518 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.519 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquired lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.519 186592 DEBUG nova.network.neutron [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.732 186592 DEBUG nova.network.neutron [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updated VIF entry in instance network info cache for port 83133bd7-0bf0-46a6-9cda-315762a021e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.733 186592 DEBUG nova.network.neutron [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.748 186592 DEBUG oslo_concurrency.lockutils [req-9de37cee-8245-4c86-b56b-086c02cf0f88 req-c219b7a4-d2a6-45a3-a3aa-fb86fcfe30d0 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:43 compute-0 nova_compute[186588]: 2026-02-26 20:50:43.755 186592 DEBUG nova.network.neutron [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:50:44 compute-0 podman[218197]: 2026-02-26 20:50:44.542166762 +0000 UTC m=+0.053719234 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 26 20:50:44 compute-0 podman[218196]: 2026-02-26 20:50:44.542941604 +0000 UTC m=+0.055970004 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.581 186592 DEBUG nova.compute.manager [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-changed-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.581 186592 DEBUG nova.compute.manager [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Refreshing instance network info cache due to event network-changed-68e4f67f-e825-4d68-a244-3a15f7c7b5fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.582 186592 DEBUG oslo_concurrency.lockutils [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:44 compute-0 podman[218198]: 2026-02-26 20:50:44.58668831 +0000 UTC m=+0.088853020 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.628 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.894 186592 DEBUG nova.compute.manager [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.895 186592 DEBUG oslo_concurrency.lockutils [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.895 186592 DEBUG oslo_concurrency.lockutils [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.895 186592 DEBUG oslo_concurrency.lockutils [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.895 186592 DEBUG nova.compute.manager [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Processing event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.896 186592 DEBUG nova.compute.manager [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.896 186592 DEBUG oslo_concurrency.lockutils [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.896 186592 DEBUG oslo_concurrency.lockutils [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.896 186592 DEBUG oslo_concurrency.lockutils [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.896 186592 DEBUG nova.compute.manager [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] No waiting events found dispatching network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.897 186592 WARNING nova.compute.manager [req-f5f2ac14-a5f0-4c17-8303-eb3957b51d3a req-98c7f162-f910-4828-a11f-464a613015b5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received unexpected event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 for instance with vm_state building and task_state spawning.
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.897 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.900 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139044.900195, 1164f692-eae8-4d3b-8453-9843d5ae0619 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.900 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] VM Resumed (Lifecycle Event)
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.901 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.912 186592 DEBUG nova.network.neutron [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Updating instance_info_cache with network_info: [{"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.914 186592 INFO nova.virt.libvirt.driver [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Instance spawned successfully.
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.914 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.951 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.954 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.978 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Releasing lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.979 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Instance network_info: |[{"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.979 186592 DEBUG oslo_concurrency.lockutils [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.979 186592 DEBUG nova.network.neutron [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Refreshing network info cache for port 68e4f67f-e825-4d68-a244-3a15f7c7b5fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.981 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Start _get_guest_xml network_info=[{"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.984 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.985 186592 WARNING nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.991 186592 DEBUG nova.virt.libvirt.host [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.992 186592 DEBUG nova.virt.libvirt.host [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.993 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.993 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.994 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.994 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.994 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:44 compute-0 nova_compute[186588]: 2026-02-26 20:50:44.994 186592 DEBUG nova.virt.libvirt.driver [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.004 186592 DEBUG nova.virt.libvirt.host [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.005 186592 DEBUG nova.virt.libvirt.host [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.005 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.005 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.005 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.005 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.006 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.007 186592 DEBUG nova.virt.hardware [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.009 186592 DEBUG nova.virt.libvirt.vif [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-522947647',display_name='tempest-ServersTestJSON-server-522947647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-522947647',id=4,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCrPFl+v17Hn6iV8B9R0+nbNbH6QR7ehTrmQvjX99UNq31cmXwGNt5I6fEWKKXTLtkPw/vAp5D1nI6Gl8sX5U1FcxQ+XyzR7S3yu1x+7EGRvVGFByxeXAQAB1Lc9pvDydg==',key_name='tempest-keypair-182490886',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2787e3da42384259a63c344570077339',ramdisk_id='',reservation_id='r-qxetr1bm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1382364450',owner_user_name='tempest-ServersTestJSON-1382364450-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d0c2e48334a4be2bd9254c186744540',uuid=c6227533-c229-4c5d-8090-798e386966a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.009 186592 DEBUG nova.network.os_vif_util [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Converting VIF {"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.010 186592 DEBUG nova.network.os_vif_util [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.011 186592 DEBUG nova.objects.instance [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6227533-c229-4c5d-8090-798e386966a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.028 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <uuid>c6227533-c229-4c5d-8090-798e386966a1</uuid>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <name>instance-00000004</name>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:name>tempest-ServersTestJSON-server-522947647</nova:name>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:50:44</nova:creationTime>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:user uuid="6d0c2e48334a4be2bd9254c186744540">tempest-ServersTestJSON-1382364450-project-member</nova:user>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:project uuid="2787e3da42384259a63c344570077339">tempest-ServersTestJSON-1382364450</nova:project>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         <nova:port uuid="68e4f67f-e825-4d68-a244-3a15f7c7b5fc">
Feb 26 20:50:45 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <system>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <entry name="serial">c6227533-c229-4c5d-8090-798e386966a1</entry>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <entry name="uuid">c6227533-c229-4c5d-8090-798e386966a1</entry>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </system>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <os>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </os>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <features>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </features>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.config"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:b9:9c:46"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <target dev="tap68e4f67f-e8"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/console.log" append="off"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <video>
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </video>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:50:45 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:50:45 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:50:45 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:50:45 compute-0 nova_compute[186588]: </domain>
Feb 26 20:50:45 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.028 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Preparing to wait for external event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.029 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.029 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.029 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.029 186592 DEBUG nova.virt.libvirt.vif [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:50:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-522947647',display_name='tempest-ServersTestJSON-server-522947647',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-522947647',id=4,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCrPFl+v17Hn6iV8B9R0+nbNbH6QR7ehTrmQvjX99UNq31cmXwGNt5I6fEWKKXTLtkPw/vAp5D1nI6Gl8sX5U1FcxQ+XyzR7S3yu1x+7EGRvVGFByxeXAQAB1Lc9pvDydg==',key_name='tempest-keypair-182490886',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2787e3da42384259a63c344570077339',ramdisk_id='',reservation_id='r-qxetr1bm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1382364450',owner_user_name='tempest-ServersTestJSON-1382364450-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:50:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d0c2e48334a4be2bd9254c186744540',uuid=c6227533-c229-4c5d-8090-798e386966a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.030 186592 DEBUG nova.network.os_vif_util [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Converting VIF {"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.030 186592 DEBUG nova.network.os_vif_util [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.030 186592 DEBUG os_vif [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.031 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.031 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.031 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.033 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.033 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68e4f67f-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.033 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68e4f67f-e8, col_values=(('external_ids', {'iface-id': '68e4f67f-e825-4d68-a244-3a15f7c7b5fc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:9c:46', 'vm-uuid': 'c6227533-c229-4c5d-8090-798e386966a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.035 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.0358] manager: (tap68e4f67f-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.037 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.040 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.041 186592 INFO os_vif [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8')
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.062 186592 INFO nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Took 13.49 seconds to spawn the instance on the hypervisor.
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.063 186592 DEBUG nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.105 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.106 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.106 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] No VIF found with MAC fa:16:3e:b9:9c:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.106 186592 INFO nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Using config drive
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.151 186592 INFO nova.compute.manager [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Took 14.57 seconds to build instance.
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.177 186592 DEBUG oslo_concurrency.lockutils [None req-34ee2c11-fd98-4ab8-a09b-9254e5ffc2ea f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.523 186592 INFO nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Creating config drive at /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.config
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.529 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpnnmz53j5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.647 186592 DEBUG oslo_concurrency.processutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpnnmz53j5" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.7061] manager: (tap68e4f67f-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 26 20:50:45 compute-0 kernel: tap68e4f67f-e8: entered promiscuous mode
Feb 26 20:50:45 compute-0 systemd-udevd[218281]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.7494] device (tap68e4f67f-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.7504] device (tap68e4f67f-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:50:45 compute-0 ovn_controller[96598]: 2026-02-26T20:50:45Z|00042|binding|INFO|Claiming lport 68e4f67f-e825-4d68-a244-3a15f7c7b5fc for this chassis.
Feb 26 20:50:45 compute-0 ovn_controller[96598]: 2026-02-26T20:50:45Z|00043|binding|INFO|68e4f67f-e825-4d68-a244-3a15f7c7b5fc: Claiming fa:16:3e:b9:9c:46 10.100.0.11
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.771 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.777 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:9c:46 10.100.0.11'], port_security=['fa:16:3e:b9:9c:46 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c6227533-c229-4c5d-8090-798e386966a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad05404a-1e87-4e64-8943-380ca32a9699', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2787e3da42384259a63c344570077339', 'neutron:revision_number': '2', 'neutron:security_group_ids': '580e7808-3fb2-47c8-b074-3981ddea0b3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b7a2450-cb62-420f-a43c-7c6813b9bd3a, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=68e4f67f-e825-4d68-a244-3a15f7c7b5fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.780 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 68e4f67f-e825-4d68-a244-3a15f7c7b5fc in datapath ad05404a-1e87-4e64-8943-380ca32a9699 bound to our chassis
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.783 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad05404a-1e87-4e64-8943-380ca32a9699
Feb 26 20:50:45 compute-0 systemd-machined[155924]: New machine qemu-4-instance-00000004.
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.787 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 ovn_controller[96598]: 2026-02-26T20:50:45Z|00044|binding|INFO|Setting lport 68e4f67f-e825-4d68-a244-3a15f7c7b5fc ovn-installed in OVS
Feb 26 20:50:45 compute-0 ovn_controller[96598]: 2026-02-26T20:50:45Z|00045|binding|INFO|Setting lport 68e4f67f-e825-4d68-a244-3a15f7c7b5fc up in Southbound
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.790 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.795 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b59f863c-7e8d-4c48-ab35-0d37f55fb0f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.797 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad05404a-11 in ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:50:45 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.799 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad05404a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.799 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[8e84b368-66ec-44aa-8182-609ff73645ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.800 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[caa29be2-b3b5-476d-864c-b84a6c43bc7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.820 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf803de-c9fe-4337-98fe-d17004ded0d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.829 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb02ff9-8f07-403d-b6b4-722fe242defa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.844 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[67051766-f653-4e36-8877-7a22e6d32419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.848 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[01640149-fbae-4ae6-8c11-bede295be7e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.8494] manager: (tapad05404a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb 26 20:50:45 compute-0 systemd-udevd[218283]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.871 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c88da4-9cd8-4dc2-a8b1-44f2a1507e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.873 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7da15d-155f-4268-823d-f5628596b63e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.8852] device (tapad05404a-10): carrier: link connected
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.892 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[0817f62d-8611-4944-b8c6-11acc0c99b2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.905 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[3810f2de-6d1b-4511-9a37-2166b781d207]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad05404a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:13:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363244, 'reachable_time': 24597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218317, 'error': None, 'target': 'ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.914 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b0baedf5-de5a-48cb-b632-49b61b35d847]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:1373'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363244, 'tstamp': 363244}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218318, 'error': None, 'target': 'ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.924 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb331d7-1e71-4d56-9aeb-f9673e99d0a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad05404a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:13:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363244, 'reachable_time': 24597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218319, 'error': None, 'target': 'ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.940 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[31a3fc30-e748-47de-8076-0aef29b51179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.970 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[2931fdb8-e1dd-48d6-b71d-1df22f6a3875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.971 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad05404a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.972 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.972 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad05404a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:45 compute-0 NetworkManager[56360]: <info>  [1772139045.9747] manager: (tapad05404a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 26 20:50:45 compute-0 kernel: tapad05404a-10: entered promiscuous mode
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.973 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.978 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad05404a-10, col_values=(('external_ids', {'iface-id': 'b0a071c6-ff84-4fda-ba8d-9af2ad9d05c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:45 compute-0 ovn_controller[96598]: 2026-02-26T20:50:45Z|00046|binding|INFO|Releasing lport b0a071c6-ff84-4fda-ba8d-9af2ad9d05c6 from this chassis (sb_readonly=0)
Feb 26 20:50:45 compute-0 nova_compute[186588]: 2026-02-26 20:50:45.982 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.982 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad05404a-1e87-4e64-8943-380ca32a9699.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad05404a-1e87-4e64-8943-380ca32a9699.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.983 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[5df9befc-dc10-406a-a414-56e7e50077fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.983 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-ad05404a-1e87-4e64-8943-380ca32a9699
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/ad05404a-1e87-4e64-8943-380ca32a9699.pid.haproxy
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID ad05404a-1e87-4e64-8943-380ca32a9699
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:50:45 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:45.984 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699', 'env', 'PROCESS_TAG=haproxy-ad05404a-1e87-4e64-8943-380ca32a9699', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad05404a-1e87-4e64-8943-380ca32a9699.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.298 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139046.2976184, c6227533-c229-4c5d-8090-798e386966a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.298 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] VM Started (Lifecycle Event)
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.325 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.329 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139046.2982318, c6227533-c229-4c5d-8090-798e386966a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.329 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] VM Paused (Lifecycle Event)
Feb 26 20:50:46 compute-0 podman[218359]: 2026-02-26 20:50:46.333078363 +0000 UTC m=+0.049421840 container create 29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223)
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.350 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.354 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:46 compute-0 systemd[1]: Started libpod-conmon-29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c.scope.
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.384 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:46 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:50:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a65425e98fbca288e0d22b302466e07c7943841a6ced5fea43111e454da95de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:50:46 compute-0 podman[218359]: 2026-02-26 20:50:46.310041618 +0000 UTC m=+0.026385115 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:50:46 compute-0 podman[218359]: 2026-02-26 20:50:46.412363998 +0000 UTC m=+0.128707505 container init 29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:50:46 compute-0 podman[218359]: 2026-02-26 20:50:46.416400805 +0000 UTC m=+0.132744292 container start 29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:50:46 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [NOTICE]   (218378) : New worker (218380) forked
Feb 26 20:50:46 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [NOTICE]   (218378) : Loading success.
Feb 26 20:50:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:46.521 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:46.521 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:46.522 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.612 186592 DEBUG nova.network.neutron [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Updated VIF entry in instance network info cache for port 68e4f67f-e825-4d68-a244-3a15f7c7b5fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.613 186592 DEBUG nova.network.neutron [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Updating instance_info_cache with network_info: [{"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:46 compute-0 nova_compute[186588]: 2026-02-26 20:50:46.646 186592 DEBUG oslo_concurrency.lockutils [req-c6c3a597-14e1-49f2-87d8-77bc3c1c902a req-1419dd7f-1386-40f6-8c61-1f44796fb2ca d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.275 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.276 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.276 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.276 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.276 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Processing event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.277 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.277 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.277 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.277 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.277 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] No waiting events found dispatching network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.277 186592 WARNING nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received unexpected event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de for instance with vm_state building and task_state spawning.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Processing event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.278 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.279 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.279 186592 DEBUG oslo_concurrency.lockutils [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.279 186592 DEBUG nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.279 186592 WARNING nova.compute.manager [req-349dd4e9-e044-44d2-bdbb-dd072f8b1670 req-b06cf615-bea8-4116-83fe-211e04753fb1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received unexpected event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with vm_state building and task_state spawning.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.280 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.280 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.284 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139047.2842739, b3fa6df3-0cc8-44f5-b1fd-b96469990594 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.284 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] VM Resumed (Lifecycle Event)
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.286 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.287 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.291 186592 INFO nova.virt.libvirt.driver [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance spawned successfully.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.292 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.295 186592 INFO nova.virt.libvirt.driver [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Instance spawned successfully.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.297 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.314 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.324 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.328 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.328 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.329 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.329 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.329 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.330 186592 DEBUG nova.virt.libvirt.driver [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.335 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.336 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.336 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.336 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.337 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.337 186592 DEBUG nova.virt.libvirt.driver [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.343 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.344 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139047.2843568, db65189c-3257-4f7c-8407-d99446ead27c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.345 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] VM Resumed (Lifecycle Event)
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.350 186592 DEBUG nova.compute.manager [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.350 186592 DEBUG oslo_concurrency.lockutils [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.350 186592 DEBUG oslo_concurrency.lockutils [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.351 186592 DEBUG oslo_concurrency.lockutils [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.351 186592 DEBUG nova.compute.manager [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Processing event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.351 186592 DEBUG nova.compute.manager [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.351 186592 DEBUG oslo_concurrency.lockutils [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.352 186592 DEBUG oslo_concurrency.lockutils [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.352 186592 DEBUG oslo_concurrency.lockutils [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.352 186592 DEBUG nova.compute.manager [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] No waiting events found dispatching network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.352 186592 WARNING nova.compute.manager [req-cd1098cb-fe26-4612-ae22-0c55b23cb86e req-4b5b7851-bbc5-4d4e-94cd-8fb1fc8f3508 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received unexpected event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc for instance with vm_state building and task_state spawning.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.362 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.369 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.373 186592 INFO nova.virt.libvirt.driver [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] Instance spawned successfully.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.373 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.435 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.440 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.442 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.443 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.443 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.444 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.444 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.445 186592 DEBUG nova.virt.libvirt.driver [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.485 186592 INFO nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Took 12.14 seconds to spawn the instance on the hypervisor.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.486 186592 DEBUG nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.486 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.487 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139047.3678584, c6227533-c229-4c5d-8090-798e386966a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.487 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] VM Resumed (Lifecycle Event)
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.533 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.537 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.551 186592 INFO nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Took 19.85 seconds to spawn the instance on the hypervisor.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.552 186592 DEBUG nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.693 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.716 186592 INFO nova.compute.manager [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Took 13.06 seconds to build instance.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.749 186592 DEBUG oslo_concurrency.lockutils [None req-16e1ba7b-e755-4402-a2aa-acb77f72d88f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.753 186592 INFO nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Took 8.53 seconds to spawn the instance on the hypervisor.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.754 186592 DEBUG nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.756 186592 INFO nova.compute.manager [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Took 20.44 seconds to build instance.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.799 186592 DEBUG oslo_concurrency.lockutils [None req-300cf4f0-8b82-4e0b-aa4b-8f0272d79ddd abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.836 186592 INFO nova.compute.manager [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Took 9.07 seconds to build instance.
Feb 26 20:50:47 compute-0 nova_compute[186588]: 2026-02-26 20:50:47.869 186592 DEBUG oslo_concurrency.lockutils [None req-71990f7a-73f7-42bd-8c54-103464abbfd7 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.049 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.049 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.050 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.050 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.050 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.051 186592 INFO nova.compute.manager [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Terminating instance
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.052 186592 DEBUG nova.compute.manager [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:50:49 compute-0 kernel: tapf4925885-9f (unregistering): left promiscuous mode
Feb 26 20:50:49 compute-0 NetworkManager[56360]: <info>  [1772139049.0824] device (tapf4925885-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:50:49 compute-0 ovn_controller[96598]: 2026-02-26T20:50:49Z|00047|binding|INFO|Releasing lport f4925885-9f0a-48b5-be05-d81d7ba1d6e0 from this chassis (sb_readonly=0)
Feb 26 20:50:49 compute-0 ovn_controller[96598]: 2026-02-26T20:50:49Z|00048|binding|INFO|Setting lport f4925885-9f0a-48b5-be05-d81d7ba1d6e0 down in Southbound
Feb 26 20:50:49 compute-0 ovn_controller[96598]: 2026-02-26T20:50:49Z|00049|binding|INFO|Removing iface tapf4925885-9f ovn-installed in OVS
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.091 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.099 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:fc:f7 10.100.0.9'], port_security=['fa:16:3e:fd:fc:f7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1164f692-eae8-4d3b-8453-9843d5ae0619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa50a5195a5249a1aed0159b8d734e3e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '90ceddb2-db48-44af-b300-f6f6275d4d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9006ad7a-16b4-476d-9d76-258ea58e2c0f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=f4925885-9f0a-48b5-be05-d81d7ba1d6e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.100 105929 INFO neutron.agent.ovn.metadata.agent [-] Port f4925885-9f0a-48b5-be05-d81d7ba1d6e0 in datapath 37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 unbound from our chassis
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.103 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.103 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.106 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[f3eab524-0100-471f-a600-750762ab640c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.108 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 namespace which is not needed anymore
Feb 26 20:50:49 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 26 20:50:49 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 4.488s CPU time.
Feb 26 20:50:49 compute-0 systemd-machined[155924]: Machine qemu-1-instance-00000002 terminated.
Feb 26 20:50:49 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [NOTICE]   (217997) : haproxy version is 2.8.14-c23fe91
Feb 26 20:50:49 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [NOTICE]   (217997) : path to executable is /usr/sbin/haproxy
Feb 26 20:50:49 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [WARNING]  (217997) : Exiting Master process...
Feb 26 20:50:49 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [WARNING]  (217997) : Exiting Master process...
Feb 26 20:50:49 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [ALERT]    (217997) : Current worker (217999) exited with code 143 (Terminated)
Feb 26 20:50:49 compute-0 neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50[217992]: [WARNING]  (217997) : All workers exited. Exiting... (0)
Feb 26 20:50:49 compute-0 systemd[1]: libpod-0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c.scope: Deactivated successfully.
Feb 26 20:50:49 compute-0 podman[218412]: 2026-02-26 20:50:49.230391841 +0000 UTC m=+0.040075860 container died 0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 26 20:50:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c-userdata-shm.mount: Deactivated successfully.
Feb 26 20:50:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-4227ca177b5bcc57d4785de9aa5bd7e36bd752ef0880bebd8259cbb02428507b-merged.mount: Deactivated successfully.
Feb 26 20:50:49 compute-0 podman[218412]: 2026-02-26 20:50:49.269870814 +0000 UTC m=+0.079554833 container cleanup 0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:50:49 compute-0 systemd[1]: libpod-conmon-0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c.scope: Deactivated successfully.
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.307 186592 INFO nova.virt.libvirt.driver [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Instance destroyed successfully.
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.308 186592 DEBUG nova.objects.instance [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lazy-loading 'resources' on Instance uuid 1164f692-eae8-4d3b-8453-9843d5ae0619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:49 compute-0 podman[218445]: 2026-02-26 20:50:49.332673609 +0000 UTC m=+0.043854650 container remove 0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.337 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e736e07c-4a97-4656-8edc-3e8bec6bf74d]: (4, ('Thu Feb 26 08:50:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 (0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c)\n0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c\nThu Feb 26 08:50:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 (0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c)\n0f00f61e168bd34bc07fb612b889e30fcf89066eef4572e7830797101c41828c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.340 186592 DEBUG nova.virt.libvirt.vif [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1796184787',display_name='tempest-ServerAddressesTestJSON-server-1796184787',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1796184787',id=2,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fa50a5195a5249a1aed0159b8d734e3e',ramdisk_id='',reservation_id='r-3fo1m4fg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-1304881296',owner_user_name='tempest-ServerAddressesTestJSON-1304881296-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:50:45Z,user_data=None,user_id='f7e7830e29f34940834b0dc390272550',uuid=1164f692-eae8-4d3b-8453-9843d5ae0619,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.340 186592 DEBUG nova.network.os_vif_util [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Converting VIF {"id": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "address": "fa:16:3e:fd:fc:f7", "network": {"id": "37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-561419759-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa50a5195a5249a1aed0159b8d734e3e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4925885-9f", "ovs_interfaceid": "f4925885-9f0a-48b5-be05-d81d7ba1d6e0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.340 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b47e67c5-ea72-4f35-8f5a-7c42d2324326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.341 186592 DEBUG nova.network.os_vif_util [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.341 186592 DEBUG os_vif [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.341 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37f38bfa-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.342 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.342 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4925885-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.389 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:49 compute-0 kernel: tap37f38bfa-b0: left promiscuous mode
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.400 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3f0e1d-9489-44cd-866c-4d2b7a50d271]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.401 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.408 186592 INFO os_vif [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:fc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f4925885-9f0a-48b5-be05-d81d7ba1d6e0,network=Network(37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4925885-9f')
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.408 186592 INFO nova.virt.libvirt.driver [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Deleting instance files /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619_del
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.409 186592 INFO nova.virt.libvirt.driver [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Deletion of /var/lib/nova/instances/1164f692-eae8-4d3b-8453-9843d5ae0619_del complete
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.411 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[1200e272-3122-4d86-9f4f-a42dcb65eef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.412 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd68deb-491e-49d3-ad6a-8694d66df587]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.428 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[f54bad15-150c-400a-99ee-42608c7e7bf8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362795, 'reachable_time': 15946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218470, 'error': None, 'target': 'ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d37f38bfa\x2dbdb3\x2d4cc3\x2db4cb\x2d39f5ee366b50.mount: Deactivated successfully.
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.442 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-37f38bfa-bdb3-4cc3-b4cb-39f5ee366b50 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:50:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:49.443 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[943ddcd4-6af8-4369-87c7-4fffef833416]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.475 186592 DEBUG nova.virt.libvirt.host [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.475 186592 INFO nova.virt.libvirt.host [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] UEFI support detected
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.477 186592 INFO nova.compute.manager [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Took 0.42 seconds to destroy the instance on the hypervisor.
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.478 186592 DEBUG oslo.service.loopingcall [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.478 186592 DEBUG nova.compute.manager [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.478 186592 DEBUG nova.network.neutron [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:50:49 compute-0 nova_compute[186588]: 2026-02-26 20:50:49.631 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.0998] manager: (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/35)
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1017] device (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <warn>  [1772139050.1020] device (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1029] manager: (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/36)
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1033] device (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <warn>  [1772139050.1035] device (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1043] manager: (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1050] manager: (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1056] device (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 26 20:50:50 compute-0 NetworkManager[56360]: <info>  [1772139050.1062] device (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.106 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.122 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:50 compute-0 ovn_controller[96598]: 2026-02-26T20:50:50Z|00050|binding|INFO|Releasing lport a2b5842f-50ae-46d2-bb42-c826042f9dfe from this chassis (sb_readonly=0)
Feb 26 20:50:50 compute-0 ovn_controller[96598]: 2026-02-26T20:50:50Z|00051|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:50:50 compute-0 ovn_controller[96598]: 2026-02-26T20:50:50Z|00052|binding|INFO|Releasing lport b0a071c6-ff84-4fda-ba8d-9af2ad9d05c6 from this chassis (sb_readonly=0)
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.146 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.187 186592 DEBUG nova.compute.manager [req-ef24c25b-24f5-47f0-81cf-6acfa5d4b486 req-6267bd7c-474d-44e7-9b21-00a9328b9e81 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-vif-unplugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.187 186592 DEBUG oslo_concurrency.lockutils [req-ef24c25b-24f5-47f0-81cf-6acfa5d4b486 req-6267bd7c-474d-44e7-9b21-00a9328b9e81 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.188 186592 DEBUG oslo_concurrency.lockutils [req-ef24c25b-24f5-47f0-81cf-6acfa5d4b486 req-6267bd7c-474d-44e7-9b21-00a9328b9e81 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.188 186592 DEBUG oslo_concurrency.lockutils [req-ef24c25b-24f5-47f0-81cf-6acfa5d4b486 req-6267bd7c-474d-44e7-9b21-00a9328b9e81 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.189 186592 DEBUG nova.compute.manager [req-ef24c25b-24f5-47f0-81cf-6acfa5d4b486 req-6267bd7c-474d-44e7-9b21-00a9328b9e81 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] No waiting events found dispatching network-vif-unplugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.189 186592 DEBUG nova.compute.manager [req-ef24c25b-24f5-47f0-81cf-6acfa5d4b486 req-6267bd7c-474d-44e7-9b21-00a9328b9e81 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-vif-unplugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.714 186592 DEBUG nova.compute.manager [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-changed-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.715 186592 DEBUG nova.compute.manager [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Refreshing instance network info cache due to event network-changed-83133bd7-0bf0-46a6-9cda-315762a021e8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.716 186592 DEBUG oslo_concurrency.lockutils [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.716 186592 DEBUG oslo_concurrency.lockutils [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.716 186592 DEBUG nova.network.neutron [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Refreshing network info cache for port 83133bd7-0bf0-46a6-9cda-315762a021e8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.935 186592 DEBUG nova.network.neutron [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:50 compute-0 nova_compute[186588]: 2026-02-26 20:50:50.950 186592 INFO nova.compute.manager [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Took 1.47 seconds to deallocate network for instance.
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.020 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.022 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:51 compute-0 ovn_controller[96598]: 2026-02-26T20:50:51Z|00053|binding|INFO|Releasing lport a2b5842f-50ae-46d2-bb42-c826042f9dfe from this chassis (sb_readonly=0)
Feb 26 20:50:51 compute-0 ovn_controller[96598]: 2026-02-26T20:50:51Z|00054|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:50:51 compute-0 ovn_controller[96598]: 2026-02-26T20:50:51Z|00055|binding|INFO|Releasing lport b0a071c6-ff84-4fda-ba8d-9af2ad9d05c6 from this chassis (sb_readonly=0)
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.071 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.072 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.073 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134bea1370>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.078 14 DEBUG ceilometer.compute.discovery [-] Querying metadata for instance c6227533-c229-4c5d-8090-798e386966a1 from Nova API get_server /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:176
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.086 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.151 186592 DEBUG nova.compute.provider_tree [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.167 186592 DEBUG nova.scheduler.client.report [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.184 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.213 186592 INFO nova.scheduler.client.report [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Deleted allocations for instance 1164f692-eae8-4d3b-8453-9843d5ae0619
Feb 26 20:50:51 compute-0 nova_compute[186588]: 2026-02-26 20:50:51.290 186592 DEBUG oslo_concurrency.lockutils [None req-c15b537c-e14a-4d8b-b907-cba8a4ae9397 f7e7830e29f34940834b0dc390272550 fa50a5195a5249a1aed0159b8d734e3e - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:51.474 14 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/c6227533-c229-4c5d-8090-798e386966a1 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b3e4473a35ee7cfb8b21c33c4813d695abd797ae73e2596c86aebf485e87031c" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:572
Feb 26 20:50:51 compute-0 podman[218475]: 2026-02-26 20:50:51.547676894 +0000 UTC m=+0.063143675 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.311 186592 DEBUG nova.compute.manager [req-a52193f3-718b-428e-8af5-72f33d1bb47e req-a8865135-9982-42bb-8138-d5b431704b5a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.311 186592 DEBUG oslo_concurrency.lockutils [req-a52193f3-718b-428e-8af5-72f33d1bb47e req-a8865135-9982-42bb-8138-d5b431704b5a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.312 186592 DEBUG oslo_concurrency.lockutils [req-a52193f3-718b-428e-8af5-72f33d1bb47e req-a8865135-9982-42bb-8138-d5b431704b5a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.312 186592 DEBUG oslo_concurrency.lockutils [req-a52193f3-718b-428e-8af5-72f33d1bb47e req-a8865135-9982-42bb-8138-d5b431704b5a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "1164f692-eae8-4d3b-8453-9843d5ae0619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.312 186592 DEBUG nova.compute.manager [req-a52193f3-718b-428e-8af5-72f33d1bb47e req-a8865135-9982-42bb-8138-d5b431704b5a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] No waiting events found dispatching network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.313 186592 WARNING nova.compute.manager [req-a52193f3-718b-428e-8af5-72f33d1bb47e req-a8865135-9982-42bb-8138-d5b431704b5a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received unexpected event network-vif-plugged-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 for instance with vm_state deleted and task_state None.
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.464 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.464 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.465 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.465 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.466 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.467 186592 INFO nova.compute.manager [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Terminating instance
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.467 186592 DEBUG nova.compute.manager [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:50:52 compute-0 kernel: tap1e16d98a-90 (unregistering): left promiscuous mode
Feb 26 20:50:52 compute-0 NetworkManager[56360]: <info>  [1772139052.4962] device (tap1e16d98a-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:50:52 compute-0 ovn_controller[96598]: 2026-02-26T20:50:52Z|00056|binding|INFO|Releasing lport 1e16d98a-902e-4ff9-ba99-475b6eeba3de from this chassis (sb_readonly=0)
Feb 26 20:50:52 compute-0 ovn_controller[96598]: 2026-02-26T20:50:52Z|00057|binding|INFO|Setting lport 1e16d98a-902e-4ff9-ba99-475b6eeba3de down in Southbound
Feb 26 20:50:52 compute-0 ovn_controller[96598]: 2026-02-26T20:50:52Z|00058|binding|INFO|Removing iface tap1e16d98a-90 ovn-installed in OVS
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.501 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.506 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.509 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:72:b9 10.100.0.4'], port_security=['fa:16:3e:3f:72:b9 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b3fa6df3-0cc8-44f5-b1fd-b96469990594', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '779290b5e1b1404b9197ae3c548b298e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '748ec538-28fd-46c5-ba4d-88f67eb179f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.226'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e0c71a4-541f-451c-b68c-f5e9d0b98930, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=1e16d98a-902e-4ff9-ba99-475b6eeba3de) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.512 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.515 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 1e16d98a-902e-4ff9-ba99-475b6eeba3de in datapath 455d5ac8-4ae4-435d-a896-0f99b4c324cc unbound from our chassis
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.517 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455d5ac8-4ae4-435d-a896-0f99b4c324cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.518 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e65c77-3c59-4608-9e34-862e6a2ea2e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.518 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc namespace which is not needed anymore
Feb 26 20:50:52 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 26 20:50:52 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Consumed 5.595s CPU time.
Feb 26 20:50:52 compute-0 systemd-machined[155924]: Machine qemu-2-instance-00000001 terminated.
Feb 26 20:50:52 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [NOTICE]   (218082) : haproxy version is 2.8.14-c23fe91
Feb 26 20:50:52 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [NOTICE]   (218082) : path to executable is /usr/sbin/haproxy
Feb 26 20:50:52 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [WARNING]  (218082) : Exiting Master process...
Feb 26 20:50:52 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [WARNING]  (218082) : Exiting Master process...
Feb 26 20:50:52 compute-0 systemd[1]: libpod-efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac.scope: Deactivated successfully.
Feb 26 20:50:52 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [ALERT]    (218082) : Current worker (218084) exited with code 143 (Terminated)
Feb 26 20:50:52 compute-0 neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc[218076]: [WARNING]  (218082) : All workers exited. Exiting... (0)
Feb 26 20:50:52 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:52.635 14 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1883 Content-Type: application/json Date: Thu, 26 Feb 2026 20:50:51 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f7ece4f8-34d9-45e6-9219-cfed292d6760 x-openstack-request-id: req-f7ece4f8-34d9-45e6-9219-cfed292d6760 _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:613
Feb 26 20:50:52 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:52.636 14 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "c6227533-c229-4c5d-8090-798e386966a1", "name": "tempest-ServersTestJSON-server-522947647", "status": "ACTIVE", "tenant_id": "2787e3da42384259a63c344570077339", "user_id": "6d0c2e48334a4be2bd9254c186744540", "metadata": {"hello": "world"}, "hostId": "f75002752593ea41d754d01c5824622838bdb947645de6bc57f70396", "image": {"id": "b79c8674-3f8a-4529-8bd8-8464687ab831", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/b79c8674-3f8a-4529-8bd8-8464687ab831"}]}, "flavor": {"id": "82d482ee-c2f1-4b05-aa1e-0019c8aae3df", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/82d482ee-c2f1-4b05-aa1e-0019c8aae3df"}]}, "created": "2026-02-26T20:50:37Z", "updated": "2026-02-26T20:50:47Z", "addresses": {"tempest-ServersTestJSON-1038687728-network": [{"version": 4, "addr": "10.100.0.11", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:b9:9c:46"}]}, "accessIPv4": "1.1.1.1", "accessIPv6": "::babe:dc0c:1602", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/c6227533-c229-4c5d-8090-798e386966a1"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/c6227533-c229-4c5d-8090-798e386966a1"}], "OS-DCF:diskConfig": "AUTO", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-keypair-182490886", "OS-SRV-USG:launched_at": "2026-02-26T20:50:47.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-securitygroup--2122430422"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000004", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, 
"os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:648
Feb 26 20:50:52 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:52.636 14 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/c6227533-c229-4c5d-8090-798e386966a1 used request id req-f7ece4f8-34d9-45e6-9219-cfed292d6760 request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:1073
Feb 26 20:50:52 compute-0 podman[218526]: 2026-02-26 20:50:52.637351076 +0000 UTC m=+0.044408316 container died efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:50:52 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:52.637 14 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c6227533-c229-4c5d-8090-798e386966a1', 'name': 'tempest-ServersTestJSON-server-522947647', 'flavor': {'id': '82d482ee-c2f1-4b05-aa1e-0019c8aae3df', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2787e3da42384259a63c344570077339', 'user_id': '6d0c2e48334a4be2bd9254c186744540', 'hostId': 'f75002752593ea41d754d01c5824622838bdb947645de6bc57f70396', 'status': 'active', 'metadata': {'hello': 'world'}} discover_libvirt_polling /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:315
Feb 26 20:50:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac-userdata-shm.mount: Deactivated successfully.
Feb 26 20:50:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-94776e599f1aacd9fae1ed88daa85fc18d7891a8e5de5ac4a04b3b7886c01fcf-merged.mount: Deactivated successfully.
Feb 26 20:50:52 compute-0 podman[218526]: 2026-02-26 20:50:52.671358493 +0000 UTC m=+0.078415733 container cleanup efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:50:52 compute-0 systemd[1]: libpod-conmon-efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac.scope: Deactivated successfully.
Feb 26 20:50:52 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:52.716 14 DEBUG ceilometer.compute.discovery [-] Querying metadata for instance b3fa6df3-0cc8-44f5-b1fd-b96469990594 from Nova API get_server /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:176
Feb 26 20:50:52 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:52.717 14 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/b3fa6df3-0cc8-44f5-b1fd-b96469990594 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b3e4473a35ee7cfb8b21c33c4813d695abd797ae73e2596c86aebf485e87031c" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:572
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.720 186592 INFO nova.virt.libvirt.driver [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Instance destroyed successfully.
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.723 186592 DEBUG nova.objects.instance [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lazy-loading 'resources' on Instance uuid b3fa6df3-0cc8-44f5-b1fd-b96469990594 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.739 186592 DEBUG nova.virt.libvirt.vif [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-314494616',display_name='tempest-ServersTestManualDisk-server-314494616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-314494616',id=1,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDwuZxPMuil7IZ7EvI9UXcF17brGEKItpyjEi6I8YUi0iy7BqrIszwhh1zSUkPSNnRpRKMcui1WDOak9RHzm7x3t3YfIgHDz4rlxymBHHZnEaV5eU+i0/UUKWlvFXG8mig==',key_name='tempest-keypair-632283486',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='779290b5e1b1404b9197ae3c548b298e',ramdisk_id='',reservation_id='r-4jfdqrgx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1930113032',owner_user_name='tempest-ServersTestManualDisk-1930113032-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:50:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='abebe541add240948f705a0b2859615f',uuid=b3fa6df3-0cc8-44f5-b1fd-b96469990594,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.740 186592 DEBUG nova.network.os_vif_util [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Converting VIF {"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.740 186592 DEBUG nova.network.os_vif_util [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.741 186592 DEBUG os_vif [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.742 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.742 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e16d98a-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.743 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 podman[218555]: 2026-02-26 20:50:52.746117978 +0000 UTC m=+0.055445391 container remove efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.746 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.748 186592 INFO os_vif [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:72:b9,bridge_name='br-int',has_traffic_filtering=True,id=1e16d98a-902e-4ff9-ba99-475b6eeba3de,network=Network(455d5ac8-4ae4-435d-a896-0f99b4c324cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e16d98a-90')
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.749 186592 INFO nova.virt.libvirt.driver [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Deleting instance files /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594_del
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.749 186592 INFO nova.virt.libvirt.driver [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Deletion of /var/lib/nova/instances/b3fa6df3-0cc8-44f5-b1fd-b96469990594_del complete
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.750 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[32e0fa11-0523-4c0d-b4f1-a562b335fbb7]: (4, ('Thu Feb 26 08:50:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc (efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac)\nefb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac\nThu Feb 26 08:50:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc (efb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac)\nefb40e2bcbca726451cd0ff801a2f95fb2ebf572d01009b2ee41b6633a8a12ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.751 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[cd142f07-4039-4f5f-9a7c-a8ec8c36a7de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.752 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455d5ac8-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.754 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 kernel: tap455d5ac8-40: left promiscuous mode
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.761 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[a992d9bd-f6a3-40a9-bed8-d0a8a93ff216]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.763 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.781 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ea72a3c1-e10e-4361-baf0-8d30174006cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.783 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[064eef5f-69b8-44fe-8c87-c3d4113899a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.793 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec22665-7a56-4029-9755-38733d3093fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362859, 'reachable_time': 32454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218587, 'error': None, 'target': 'ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d455d5ac8\x2d4ae4\x2d435d\x2da896\x2d0f99b4c324cc.mount: Deactivated successfully.
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.799 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455d5ac8-4ae4-435d-a896-0f99b4c324cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:50:52 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:52.799 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[81c291b6-db36-4f32-bdea-1eb6c9f16b1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.805 186592 INFO nova.compute.manager [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.806 186592 DEBUG oslo.service.loopingcall [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.806 186592 DEBUG nova.compute.manager [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:50:52 compute-0 nova_compute[186588]: 2026-02-26 20:50:52.806 186592 DEBUG nova.network.neutron [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.123 186592 DEBUG nova.compute.manager [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Received event network-vif-deleted-f4925885-9f0a-48b5-be05-d81d7ba1d6e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.124 186592 DEBUG nova.compute.manager [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-changed-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.124 186592 DEBUG nova.compute.manager [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Refreshing instance network info cache due to event network-changed-1e16d98a-902e-4ff9-ba99-475b6eeba3de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.124 186592 DEBUG oslo_concurrency.lockutils [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.124 186592 DEBUG oslo_concurrency.lockutils [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.125 186592 DEBUG nova.network.neutron [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Refreshing network info cache for port 1e16d98a-902e-4ff9-ba99-475b6eeba3de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.788 14 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1901 Content-Type: application/json Date: Thu, 26 Feb 2026 20:50:52 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2d77e370-4780-483e-a8c1-6eee9fb5406a x-openstack-request-id: req-2d77e370-4780-483e-a8c1-6eee9fb5406a _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:613
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.788 14 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "b3fa6df3-0cc8-44f5-b1fd-b96469990594", "name": "tempest-ServersTestManualDisk-server-314494616", "status": "ACTIVE", "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "user_id": "abebe541add240948f705a0b2859615f", "metadata": {"hello": "world"}, "hostId": "b36077a7ce76d92856f1ed8c42424ad2f9279632901869d47c1364f7", "image": {"id": "b79c8674-3f8a-4529-8bd8-8464687ab831", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/b79c8674-3f8a-4529-8bd8-8464687ab831"}]}, "flavor": {"id": "82d482ee-c2f1-4b05-aa1e-0019c8aae3df", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/82d482ee-c2f1-4b05-aa1e-0019c8aae3df"}]}, "created": "2026-02-26T20:50:23Z", "updated": "2026-02-26T20:50:52Z", "addresses": {"tempest-ServersTestManualDisk-2078126051-network": [{"version": 4, "addr": "10.100.0.4", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:3f:72:b9"}]}, "accessIPv4": "1.1.1.1", "accessIPv6": "::babe:dc0c:1602", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/b3fa6df3-0cc8-44f5-b1fd-b96469990594"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/b3fa6df3-0cc8-44f5-b1fd-b96469990594"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-keypair-632283486", "OS-SRV-USG:launched_at": "2026-02-26T20:50:47.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-securitygroup--371513155"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000001", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": "deleting", "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, 
"os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:648
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.788 14 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/b3fa6df3-0cc8-44f5-b1fd-b96469990594 used request id req-2d77e370-4780-483e-a8c1-6eee9fb5406a request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:1073
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'b3fa6df3-0cc8-44f5-b1fd-b96469990594' (instance-00000001)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager [-] Unable to discover resources: Domain not found: no domain with matching uuid 'b3fa6df3-0cc8-44f5-b1fd-b96469990594' (instance-00000001): libvirt.libvirtError: Domain not found: no domain with matching uuid 'b3fa6df3-0cc8-44f5-b1fd-b96469990594' (instance-00000001)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager Traceback (most recent call last):
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/ceilometer/polling/manager.py", line 959, in discover
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     discovered = discoverer.discover(self, param)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py", line 125, in discover
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     return self.discover_libvirt_polling(manager, param=None)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/tenacity/__init__.py", line 289, in wrapped_f
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     return self(f, *args, **kw)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager            ^^^^^^^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/tenacity/__init__.py", line 379, in __call__
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     do = self.iter(retry_state=retry_state)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/tenacity/__init__.py", line 314, in iter
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     return fut.result()
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager            ^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib64/python3.12/concurrent/futures/_base.py", line 449, in result
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     return self.__get_result()
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager            ^^^^^^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib64/python3.12/concurrent/futures/_base.py", line 401, in __get_result
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     raise self._exception
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/tenacity/__init__.py", line 382, in __call__
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     result = fn(*args, **kwargs)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager              ^^^^^^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py", line 274, in discover_libvirt_polling
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     dom_state = domain.state()[0]
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager                 ^^^^^^^^^^^^^^
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 3271, in state
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager     raise libvirtError('virDomainGetState() failed')
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager libvirt.libvirtError: Domain not found: no domain with matching uuid 'b3fa6df3-0cc8-44f5-b1fd-b96469990594' (instance-00000001)
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.789 14 ERROR ceilometer.polling.manager 
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.792 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.792 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.794 14 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c6227533-c229-4c5d-8090-798e386966a1', 'name': 'tempest-ServersTestJSON-server-522947647', 'flavor': {'id': '82d482ee-c2f1-4b05-aa1e-0019c8aae3df', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2787e3da42384259a63c344570077339', 'user_id': '6d0c2e48334a4be2bd9254c186744540', 'hostId': 'f75002752593ea41d754d01c5824622838bdb947645de6bc57f70396', 'status': 'active', 'metadata': {'hello': 'world'}} discover_libvirt_polling /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:315
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.796 14 DEBUG ceilometer.compute.discovery [-] Querying metadata for instance db65189c-3257-4f7c-8407-d99446ead27c from Nova API get_server /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:176
Feb 26 20:50:53 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:53.796 14 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/db65189c-3257-4f7c-8407-d99446ead27c -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b3e4473a35ee7cfb8b21c33c4813d695abd797ae73e2596c86aebf485e87031c" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:572
Feb 26 20:50:53 compute-0 ovn_controller[96598]: 2026-02-26T20:50:53Z|00059|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:50:53 compute-0 ovn_controller[96598]: 2026-02-26T20:50:53Z|00060|binding|INFO|Releasing lport b0a071c6-ff84-4fda-ba8d-9af2ad9d05c6 from this chassis (sb_readonly=0)
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.857 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.955 186592 DEBUG nova.network.neutron [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updated VIF entry in instance network info cache for port 83133bd7-0bf0-46a6-9cda-315762a021e8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.956 186592 DEBUG nova.network.neutron [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:53 compute-0 nova_compute[186588]: 2026-02-26 20:50:53.986 186592 DEBUG oslo_concurrency.lockutils [req-678e4ba6-ee8d-4508-ad27-7f21d0a40392 req-9f2499ac-9db7-48e7-9e24-619a44d051db d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.448 14 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1858 Content-Type: application/json Date: Thu, 26 Feb 2026 20:50:53 GMT Keep-Alive: timeout=5, max=98 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-cd63aabe-cd4c-4d46-be31-ceaf47838538 x-openstack-request-id: req-cd63aabe-cd4c-4d46-be31-ceaf47838538 _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:613
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.449 14 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "db65189c-3257-4f7c-8407-d99446ead27c", "name": "tempest-ServerActionsTestJSON-server-789364433", "status": "ACTIVE", "tenant_id": "93f63acb614a4c41813a655e2176374f", "user_id": "683dc1563e22496ba81bf3253756023f", "metadata": {}, "hostId": "70dcad6d09ff3b0bb113abf77693354852251b5dd7a0c32b5c9e2c33", "image": {"id": "b79c8674-3f8a-4529-8bd8-8464687ab831", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/b79c8674-3f8a-4529-8bd8-8464687ab831"}]}, "flavor": {"id": "82d482ee-c2f1-4b05-aa1e-0019c8aae3df", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/82d482ee-c2f1-4b05-aa1e-0019c8aae3df"}]}, "created": "2026-02-26T20:50:33Z", "updated": "2026-02-26T20:50:47Z", "addresses": {"tempest-ServerActionsTestJSON-1696189026-network": [{"version": 4, "addr": "10.100.0.14", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:77:0b:72"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/db65189c-3257-4f7c-8407-d99446ead27c"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/db65189c-3257-4f7c-8407-d99446ead27c"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-keypair-883907450", "OS-SRV-USG:launched_at": "2026-02-26T20:50:47.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-securitygroup--1160528425"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000003", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:648
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.449 14 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/db65189c-3257-4f7c-8407-d99446ead27c used request id req-cd63aabe-cd4c-4d46-be31-ceaf47838538 request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:1073
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.450 14 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'db65189c-3257-4f7c-8407-d99446ead27c', 'name': 'tempest-ServerActionsTestJSON-server-789364433', 'flavor': {'id': '82d482ee-c2f1-4b05-aa1e-0019c8aae3df', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '93f63acb614a4c41813a655e2176374f', 'user_id': '683dc1563e22496ba81bf3253756023f', 'hostId': '70dcad6d09ff3b0bb113abf77693354852251b5dd7a0c32b5c9e2c33', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:315
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.450 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.450 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.451 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.451 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.452 14 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:162
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.452 14 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestJSON-server-522947647>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-789364433>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-522947647>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-789364433>]
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.453 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-02-26T20:50:54.451260) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.453 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.453 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.453 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.454 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.454 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.454 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-02-26T20:50:54.454192) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.458 14 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c6227533-c229-4c5d-8090-798e386966a1 / tap68e4f67f-e8 inspect_vnics /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/inspector.py:143
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.458 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.462 14 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for db65189c-3257-4f7c-8407-d99446ead27c / tap83133bd7-0b inspect_vnics /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/inspector.py:143
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.462 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.463 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.463 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-02-26T20:50:54.464284) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.464 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.465 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.465 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.465 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.466 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.466 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.466 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.466 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.466 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.467 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.467 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.467 14 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.467 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.468 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.468 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.469 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.469 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.469 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.469 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.469 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.470 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.470 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-02-26T20:50:54.466263) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.470 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-02-26T20:50:54.468200) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.470 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.470 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.470 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.471 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-02-26T20:50:54.470063) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.471 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.471 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.471 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.471 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.471 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.472 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-02-26T20:50:54.471719) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.484 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.485 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.497 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.498 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.498 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.498 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.498 14 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.498 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.498 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.499 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.499 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-02-26T20:50:54.499046) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.513 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/cpu volume: 6780000000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.528 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/cpu volume: 6940000000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.529 14 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.529 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.529 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.530 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.530 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.530 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.530 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.530 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-02-26T20:50:54.530210) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.530 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.531 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.532 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.532 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.532 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.532 14 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-02-26T20:50:54.531812) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-02-26T20:50:54.533434) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 14 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c6227533-c229-4c5d-8090-798e386966a1: ceilometer.compute.pollsters.NoVolumeException
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.533 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance db65189c-3257-4f7c-8407-d99446ead27c: ceilometer.compute.pollsters.NoVolumeException
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-02-26T20:50:54.534727) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.534 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/power.state volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.535 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/power.state volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.535 14 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.535 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.536 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.537 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.537 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.537 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.537 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-02-26T20:50:54.536288) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-02-26T20:50:54.538338) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.538 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.539 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.539 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.539 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.539 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.539 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.539 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.540 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-02-26T20:50:54.539698) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.561 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.561 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.583 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.584 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.585 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.585 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.585 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.585 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.585 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.585 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.586 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.read.latency volume: 456875195 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.586 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-02-26T20:50:54.585733) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.586 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.read.latency volume: 1622853 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.586 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.read.latency volume: 440756830 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.586 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.read.latency volume: 3489793 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.587 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-02-26T20:50:54.587647) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.588 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.588 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.588 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.588 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-02-26T20:50:54.589491) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:162
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.589 14 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestJSON-server-522947647>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-789364433>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-522947647>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-789364433>]
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-02-26T20:50:54.590438) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.590 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.591 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.591 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.591 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.591 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.591 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-02-26T20:50:54.592206) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.592 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.593 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.593 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.593 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.593 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.593 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.593 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.594 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.594 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.594 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-02-26T20:50:54.594001) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.594 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.594 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.594 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-02-26T20:50:54.595233) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.595 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.596 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.597 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-02-26T20:50:54.596960) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.597 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.597 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.597 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.597 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-02-26T20:50:54.598731) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.598 14 DEBUG ceilometer.compute.pollsters [-] c6227533-c229-4c5d-8090-798e386966a1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 DEBUG ceilometer.compute.pollsters [-] db65189c-3257-4f7c-8407-d99446ead27c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.599 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.600 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.600 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-02-26T20:50:54.599955) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.600 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.601 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.602 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.603 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:50:54.603 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:50:54 compute-0 nova_compute[186588]: 2026-02-26 20:50:54.632 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.178 186592 DEBUG nova.compute.manager [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-changed-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.178 186592 DEBUG nova.compute.manager [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Refreshing instance network info cache due to event network-changed-68e4f67f-e825-4d68-a244-3a15f7c7b5fc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.178 186592 DEBUG oslo_concurrency.lockutils [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.179 186592 DEBUG oslo_concurrency.lockutils [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.179 186592 DEBUG nova.network.neutron [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Refreshing network info cache for port 68e4f67f-e825-4d68-a244-3a15f7c7b5fc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.454 186592 DEBUG nova.compute.manager [req-6194d752-f703-4769-832f-f81d4eb172e1 req-e868bcf4-5022-4bcd-9f5b-aedb47eb0e67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.455 186592 DEBUG oslo_concurrency.lockutils [req-6194d752-f703-4769-832f-f81d4eb172e1 req-e868bcf4-5022-4bcd-9f5b-aedb47eb0e67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.455 186592 DEBUG oslo_concurrency.lockutils [req-6194d752-f703-4769-832f-f81d4eb172e1 req-e868bcf4-5022-4bcd-9f5b-aedb47eb0e67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.455 186592 DEBUG oslo_concurrency.lockutils [req-6194d752-f703-4769-832f-f81d4eb172e1 req-e868bcf4-5022-4bcd-9f5b-aedb47eb0e67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.455 186592 DEBUG nova.compute.manager [req-6194d752-f703-4769-832f-f81d4eb172e1 req-e868bcf4-5022-4bcd-9f5b-aedb47eb0e67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] No waiting events found dispatching network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.456 186592 WARNING nova.compute.manager [req-6194d752-f703-4769-832f-f81d4eb172e1 req-e868bcf4-5022-4bcd-9f5b-aedb47eb0e67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received unexpected event network-vif-plugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de for instance with vm_state active and task_state deleting.
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.478 186592 DEBUG nova.network.neutron [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.504 186592 INFO nova.compute.manager [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Took 2.70 seconds to deallocate network for instance.
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.559 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.559 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.649 186592 DEBUG nova.compute.provider_tree [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.664 186592 DEBUG nova.scheduler.client.report [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.693 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.733 186592 INFO nova.scheduler.client.report [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Deleted allocations for instance b3fa6df3-0cc8-44f5-b1fd-b96469990594
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.839 186592 DEBUG oslo_concurrency.lockutils [None req-4de481c2-9395-4a8f-9883-c8a9fca17a82 abebe541add240948f705a0b2859615f 779290b5e1b1404b9197ae3c548b298e - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.910 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.911 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.911 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.911 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.912 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.913 186592 INFO nova.compute.manager [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Terminating instance
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.913 186592 DEBUG nova.compute.manager [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:50:55 compute-0 kernel: tap68e4f67f-e8 (unregistering): left promiscuous mode
Feb 26 20:50:55 compute-0 NetworkManager[56360]: <info>  [1772139055.9506] device (tap68e4f67f-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:50:55 compute-0 ovn_controller[96598]: 2026-02-26T20:50:55Z|00061|binding|INFO|Releasing lport 68e4f67f-e825-4d68-a244-3a15f7c7b5fc from this chassis (sb_readonly=0)
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.955 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:55 compute-0 ovn_controller[96598]: 2026-02-26T20:50:55Z|00062|binding|INFO|Setting lport 68e4f67f-e825-4d68-a244-3a15f7c7b5fc down in Southbound
Feb 26 20:50:55 compute-0 ovn_controller[96598]: 2026-02-26T20:50:55Z|00063|binding|INFO|Removing iface tap68e4f67f-e8 ovn-installed in OVS
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.960 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:55 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:55.964 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:9c:46 10.100.0.11'], port_security=['fa:16:3e:b9:9c:46 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c6227533-c229-4c5d-8090-798e386966a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad05404a-1e87-4e64-8943-380ca32a9699', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2787e3da42384259a63c344570077339', 'neutron:revision_number': '4', 'neutron:security_group_ids': '580e7808-3fb2-47c8-b074-3981ddea0b3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b7a2450-cb62-420f-a43c-7c6813b9bd3a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=68e4f67f-e825-4d68-a244-3a15f7c7b5fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:50:55 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:55.965 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 68e4f67f-e825-4d68-a244-3a15f7c7b5fc in datapath ad05404a-1e87-4e64-8943-380ca32a9699 unbound from our chassis
Feb 26 20:50:55 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:55.967 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad05404a-1e87-4e64-8943-380ca32a9699, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:50:55 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:55.968 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ebc95ad6-ee2f-4e5e-8d03-c245565aaff9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:55 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:55.968 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699 namespace which is not needed anymore
Feb 26 20:50:55 compute-0 nova_compute[186588]: 2026-02-26 20:50:55.988 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 26 20:50:56 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 9.104s CPU time.
Feb 26 20:50:56 compute-0 systemd-machined[155924]: Machine qemu-4-instance-00000004 terminated.
Feb 26 20:50:56 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [NOTICE]   (218378) : haproxy version is 2.8.14-c23fe91
Feb 26 20:50:56 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [NOTICE]   (218378) : path to executable is /usr/sbin/haproxy
Feb 26 20:50:56 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [WARNING]  (218378) : Exiting Master process...
Feb 26 20:50:56 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [WARNING]  (218378) : Exiting Master process...
Feb 26 20:50:56 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [ALERT]    (218378) : Current worker (218380) exited with code 143 (Terminated)
Feb 26 20:50:56 compute-0 neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699[218374]: [WARNING]  (218378) : All workers exited. Exiting... (0)
Feb 26 20:50:56 compute-0 systemd[1]: libpod-29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c.scope: Deactivated successfully.
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.101 186592 DEBUG nova.network.neutron [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Updated VIF entry in instance network info cache for port 1e16d98a-902e-4ff9-ba99-475b6eeba3de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.102 186592 DEBUG nova.network.neutron [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Updating instance_info_cache with network_info: [{"id": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "address": "fa:16:3e:3f:72:b9", "network": {"id": "455d5ac8-4ae4-435d-a896-0f99b4c324cc", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2078126051-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "779290b5e1b1404b9197ae3c548b298e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e16d98a-90", "ovs_interfaceid": "1e16d98a-902e-4ff9-ba99-475b6eeba3de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:56 compute-0 podman[218612]: 2026-02-26 20:50:56.106093609 +0000 UTC m=+0.042923026 container died 29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.131 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.134 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.138 186592 DEBUG oslo_concurrency.lockutils [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-b3fa6df3-0cc8-44f5-b1fd-b96469990594" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.139 186592 DEBUG nova.compute.manager [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-vif-unplugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c-userdata-shm.mount: Deactivated successfully.
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.139 186592 DEBUG oslo_concurrency.lockutils [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.140 186592 DEBUG oslo_concurrency.lockutils [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.140 186592 DEBUG oslo_concurrency.lockutils [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "b3fa6df3-0cc8-44f5-b1fd-b96469990594-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.140 186592 DEBUG nova.compute.manager [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] No waiting events found dispatching network-vif-unplugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.141 186592 DEBUG nova.compute.manager [req-ed839f85-3b19-46fe-aa60-bbdd3b61dc17 req-f6808a54-345c-464a-bcad-7bfbdbd14415 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-vif-unplugged-1e16d98a-902e-4ff9-ba99-475b6eeba3de for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 26 20:50:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a65425e98fbca288e0d22b302466e07c7943841a6ced5fea43111e454da95de-merged.mount: Deactivated successfully.
Feb 26 20:50:56 compute-0 podman[218612]: 2026-02-26 20:50:56.146422155 +0000 UTC m=+0.083251572 container cleanup 29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:50:56 compute-0 systemd[1]: libpod-conmon-29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c.scope: Deactivated successfully.
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.169 186592 INFO nova.virt.libvirt.driver [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] Instance destroyed successfully.
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.170 186592 DEBUG nova.objects.instance [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lazy-loading 'resources' on Instance uuid c6227533-c229-4c5d-8090-798e386966a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.187 186592 DEBUG nova.virt.libvirt.vif [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-522947647',display_name='tempest-ServersTestJSON-server-522947647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-522947647',id=4,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCrPFl+v17Hn6iV8B9R0+nbNbH6QR7ehTrmQvjX99UNq31cmXwGNt5I6fEWKKXTLtkPw/vAp5D1nI6Gl8sX5U1FcxQ+XyzR7S3yu1x+7EGRvVGFByxeXAQAB1Lc9pvDydg==',key_name='tempest-keypair-182490886',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2787e3da42384259a63c344570077339',ramdisk_id='',reservation_id='r-qxetr1bm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1382364450',owner_user_name='tempest-ServersTestJSON-1382364450-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:50:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d0c2e48334a4be2bd9254c186744540',uuid=c6227533-c229-4c5d-8090-798e386966a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.189 186592 DEBUG nova.network.os_vif_util [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Converting VIF {"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.190 186592 DEBUG nova.network.os_vif_util [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.190 186592 DEBUG os_vif [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.192 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.192 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68e4f67f-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.194 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.195 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.198 186592 INFO os_vif [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:9c:46,bridge_name='br-int',has_traffic_filtering=True,id=68e4f67f-e825-4d68-a244-3a15f7c7b5fc,network=Network(ad05404a-1e87-4e64-8943-380ca32a9699),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68e4f67f-e8')
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.198 186592 INFO nova.virt.libvirt.driver [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Deleting instance files /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1_del
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.199 186592 INFO nova.virt.libvirt.driver [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Deletion of /var/lib/nova/instances/c6227533-c229-4c5d-8090-798e386966a1_del complete
Feb 26 20:50:56 compute-0 podman[218657]: 2026-02-26 20:50:56.207002821 +0000 UTC m=+0.037758258 container remove 29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.215 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[4900acd7-0030-4ed7-9625-31d00c7b4612]: (4, ('Thu Feb 26 08:50:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699 (29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c)\n29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c\nThu Feb 26 08:50:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699 (29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c)\n29ac747844f4da104a7466bdc460a31a3ec6dfbbed7fdfc16cc2f8d3f3557b6c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.217 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[044a0b91-ece0-45f0-a333-54d6b78ae361]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.218 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad05404a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.220 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 kernel: tapad05404a-10: left promiscuous mode
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.222 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.229 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f00389-5ed1-4f48-949a-03b39616ed39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.230 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.246 186592 INFO nova.compute.manager [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.247 186592 DEBUG oslo.service.loopingcall [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.247 186592 DEBUG nova.compute.manager [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:50:56 compute-0 nova_compute[186588]: 2026-02-26 20:50:56.247 186592 DEBUG nova.network.neutron [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.249 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[c885ab16-175c-45f8-b313-64befea3865f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.250 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[38c87f20-8d01-4dda-b6c0-475954c46fc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.263 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[adcd64ab-32e7-4eeb-b71e-e859ae91bb5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363240, 'reachable_time': 26214, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218671, 'error': None, 'target': 'ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dad05404a\x2d1e87\x2d4e64\x2d8943\x2d380ca32a9699.mount: Deactivated successfully.
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.268 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad05404a-1e87-4e64-8943-380ca32a9699 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:50:56 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:50:56.269 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[87294ceb-3b92-4fe4-ae43-7d65afb94ff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.529 186592 DEBUG nova.compute.manager [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-vif-unplugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.529 186592 DEBUG oslo_concurrency.lockutils [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.529 186592 DEBUG oslo_concurrency.lockutils [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.529 186592 DEBUG oslo_concurrency.lockutils [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.530 186592 DEBUG nova.compute.manager [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] No waiting events found dispatching network-vif-unplugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.530 186592 DEBUG nova.compute.manager [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-vif-unplugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.530 186592 DEBUG nova.compute.manager [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.530 186592 DEBUG oslo_concurrency.lockutils [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "c6227533-c229-4c5d-8090-798e386966a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.530 186592 DEBUG oslo_concurrency.lockutils [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.530 186592 DEBUG oslo_concurrency.lockutils [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.531 186592 DEBUG nova.compute.manager [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] No waiting events found dispatching network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.531 186592 WARNING nova.compute.manager [req-a5375332-d06d-4b21-8e76-52e3022e0e4b req-abc16d3a-9dec-4c21-99a1-ee4aa1c865ed d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received unexpected event network-vif-plugged-68e4f67f-e825-4d68-a244-3a15f7c7b5fc for instance with vm_state active and task_state deleting.
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.709 186592 DEBUG nova.compute.manager [req-72bfa92e-4679-4a96-90f6-3992f215e1f5 req-6751dc83-e2f7-418b-befb-c04d284aada4 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Received event network-vif-deleted-1e16d98a-902e-4ff9-ba99-475b6eeba3de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.850 186592 DEBUG nova.network.neutron [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Updated VIF entry in instance network info cache for port 68e4f67f-e825-4d68-a244-3a15f7c7b5fc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.851 186592 DEBUG nova.network.neutron [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Updating instance_info_cache with network_info: [{"id": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "address": "fa:16:3e:b9:9c:46", "network": {"id": "ad05404a-1e87-4e64-8943-380ca32a9699", "bridge": "br-int", "label": "tempest-ServersTestJSON-1038687728-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2787e3da42384259a63c344570077339", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68e4f67f-e8", "ovs_interfaceid": "68e4f67f-e825-4d68-a244-3a15f7c7b5fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.872 186592 DEBUG oslo_concurrency.lockutils [req-0b265a78-b3b1-4e00-bd89-3500010c91e8 req-38706256-76bc-44e4-b084-36aca6802bb6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-c6227533-c229-4c5d-8090-798e386966a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.900 186592 DEBUG nova.network.neutron [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.918 186592 INFO nova.compute.manager [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] Took 1.67 seconds to deallocate network for instance.
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.960 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:50:57 compute-0 nova_compute[186588]: 2026-02-26 20:50:57.960 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:50:58 compute-0 nova_compute[186588]: 2026-02-26 20:50:58.030 186592 DEBUG nova.compute.provider_tree [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:50:58 compute-0 nova_compute[186588]: 2026-02-26 20:50:58.046 186592 DEBUG nova.scheduler.client.report [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:50:58 compute-0 nova_compute[186588]: 2026-02-26 20:50:58.077 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:58 compute-0 nova_compute[186588]: 2026-02-26 20:50:58.140 186592 INFO nova.scheduler.client.report [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Deleted allocations for instance c6227533-c229-4c5d-8090-798e386966a1
Feb 26 20:50:58 compute-0 nova_compute[186588]: 2026-02-26 20:50:58.258 186592 DEBUG oslo_concurrency.lockutils [None req-9e1ebfc5-da74-43b1-a9c9-ae1a2bdee6ff 6d0c2e48334a4be2bd9254c186744540 2787e3da42384259a63c344570077339 - - default default] Lock "c6227533-c229-4c5d-8090-798e386966a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:50:58 compute-0 ovn_controller[96598]: 2026-02-26T20:50:58Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:0b:72 10.100.0.14
Feb 26 20:50:58 compute-0 ovn_controller[96598]: 2026-02-26T20:50:58Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:0b:72 10.100.0.14
Feb 26 20:50:58 compute-0 podman[218693]: 2026-02-26 20:50:58.546684382 +0000 UTC m=+0.054863005 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:50:59 compute-0 ovn_controller[96598]: 2026-02-26T20:50:59Z|00064|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:50:59 compute-0 nova_compute[186588]: 2026-02-26 20:50:59.302 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:59 compute-0 nova_compute[186588]: 2026-02-26 20:50:59.634 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:50:59 compute-0 podman[202527]: time="2026-02-26T20:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:50:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23221 "" "Go-http-client/1.1"
Feb 26 20:50:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3497 "" "Go-http-client/1.1"
Feb 26 20:50:59 compute-0 nova_compute[186588]: 2026-02-26 20:50:59.936 186592 DEBUG nova.compute.manager [req-c2781f89-39d5-40cb-b09a-c336167970cb req-33e92241-0c32-491a-9f88-835941fc291f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: c6227533-c229-4c5d-8090-798e386966a1] Received event network-vif-deleted-68e4f67f-e825-4d68-a244-3a15f7c7b5fc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:01 compute-0 nova_compute[186588]: 2026-02-26 20:51:01.194 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:01 compute-0 openstack_network_exporter[205682]: ERROR   20:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:51:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:51:01 compute-0 openstack_network_exporter[205682]: ERROR   20:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:51:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:51:02 compute-0 ovn_controller[96598]: 2026-02-26T20:51:02Z|00065|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:51:02 compute-0 nova_compute[186588]: 2026-02-26 20:51:02.424 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:02 compute-0 podman[218717]: 2026-02-26 20:51:02.556331826 +0000 UTC m=+0.071639432 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public)
Feb 26 20:51:02 compute-0 ovn_controller[96598]: 2026-02-26T20:51:02Z|00066|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:51:03 compute-0 nova_compute[186588]: 2026-02-26 20:51:03.014 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:04 compute-0 nova_compute[186588]: 2026-02-26 20:51:04.306 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139049.305344, 1164f692-eae8-4d3b-8453-9843d5ae0619 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:04 compute-0 nova_compute[186588]: 2026-02-26 20:51:04.307 186592 INFO nova.compute.manager [-] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] VM Stopped (Lifecycle Event)
Feb 26 20:51:04 compute-0 nova_compute[186588]: 2026-02-26 20:51:04.325 186592 DEBUG nova.compute.manager [None req-6eaf4f37-533c-4046-974e-302ecb1ac9d9 - - - - - -] [instance: 1164f692-eae8-4d3b-8453-9843d5ae0619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:04 compute-0 nova_compute[186588]: 2026-02-26 20:51:04.634 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:04 compute-0 nova_compute[186588]: 2026-02-26 20:51:04.636 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:06 compute-0 nova_compute[186588]: 2026-02-26 20:51:06.248 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:07 compute-0 nova_compute[186588]: 2026-02-26 20:51:07.717 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139052.7162871, b3fa6df3-0cc8-44f5-b1fd-b96469990594 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:07 compute-0 nova_compute[186588]: 2026-02-26 20:51:07.718 186592 INFO nova.compute.manager [-] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] VM Stopped (Lifecycle Event)
Feb 26 20:51:07 compute-0 nova_compute[186588]: 2026-02-26 20:51:07.749 186592 DEBUG nova.compute.manager [None req-c55060ca-e43a-4955-8180-83044a124eaa - - - - - -] [instance: b3fa6df3-0cc8-44f5-b1fd-b96469990594] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:51:08 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:08.477 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:c2:31', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:84:98:ae:7a:1c'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.478 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:08 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:08.479 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.700 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.701 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquired lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.701 186592 DEBUG nova.network.neutron [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 26 20:51:08 compute-0 nova_compute[186588]: 2026-02-26 20:51:08.701 186592 DEBUG nova.objects.instance [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lazy-loading 'info_cache' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:09 compute-0 nova_compute[186588]: 2026-02-26 20:51:09.274 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:09 compute-0 nova_compute[186588]: 2026-02-26 20:51:09.636 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:11 compute-0 nova_compute[186588]: 2026-02-26 20:51:11.167 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139056.16577, c6227533-c229-4c5d-8090-798e386966a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:11 compute-0 nova_compute[186588]: 2026-02-26 20:51:11.167 186592 INFO nova.compute.manager [-] [instance: c6227533-c229-4c5d-8090-798e386966a1] VM Stopped (Lifecycle Event)
Feb 26 20:51:11 compute-0 nova_compute[186588]: 2026-02-26 20:51:11.188 186592 DEBUG nova.compute.manager [None req-dce043c7-6e21-4e22-84a5-f95eb871341f - - - - - -] [instance: c6227533-c229-4c5d-8090-798e386966a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:11 compute-0 nova_compute[186588]: 2026-02-26 20:51:11.250 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:12 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:12.483 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.964 186592 DEBUG nova.network.neutron [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.979 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Releasing lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.980 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.980 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.981 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.981 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.982 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.982 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.983 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.983 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 26 20:51:12 compute-0 nova_compute[186588]: 2026-02-26 20:51:12.999 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.012 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.012 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.013 186592 INFO nova.compute.manager [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Rebooting instance
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.076 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.078 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.078 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquired lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.078 186592 DEBUG nova.network.neutron [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.079 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.108 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.109 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.109 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.109 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.192 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.269 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.270 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.323 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.468 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.469 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5431MB free_disk=72.71218872070312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.469 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.469 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.663 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.717 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Instance db65189c-3257-4f7c-8407-d99446ead27c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.717 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:51:14 compute-0 nova_compute[186588]: 2026-02-26 20:51:14.717 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:51:14 compute-0 podman[218746]: 2026-02-26 20:51:14.735536827 +0000 UTC m=+0.049410889 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:51:14 compute-0 podman[218745]: 2026-02-26 20:51:14.74237388 +0000 UTC m=+0.055305707 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 26 20:51:14 compute-0 podman[218747]: 2026-02-26 20:51:14.756678721 +0000 UTC m=+0.066095664 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Feb 26 20:51:15 compute-0 nova_compute[186588]: 2026-02-26 20:51:15.229 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:51:15 compute-0 nova_compute[186588]: 2026-02-26 20:51:15.245 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:51:15 compute-0 nova_compute[186588]: 2026-02-26 20:51:15.270 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:51:15 compute-0 nova_compute[186588]: 2026-02-26 20:51:15.270 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:16 compute-0 nova_compute[186588]: 2026-02-26 20:51:16.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:16 compute-0 nova_compute[186588]: 2026-02-26 20:51:16.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:51:16 compute-0 nova_compute[186588]: 2026-02-26 20:51:16.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:16 compute-0 nova_compute[186588]: 2026-02-26 20:51:16.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 26 20:51:16 compute-0 nova_compute[186588]: 2026-02-26 20:51:16.074 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:51:16 compute-0 nova_compute[186588]: 2026-02-26 20:51:16.254 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.513 186592 DEBUG nova.network.neutron [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.530 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Releasing lock "refresh_cache-db65189c-3257-4f7c-8407-d99446ead27c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.532 186592 DEBUG nova.compute.manager [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:17 compute-0 kernel: tap83133bd7-0b (unregistering): left promiscuous mode
Feb 26 20:51:17 compute-0 NetworkManager[56360]: <info>  [1772139077.6661] device (tap83133bd7-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:51:17 compute-0 ovn_controller[96598]: 2026-02-26T20:51:17Z|00067|binding|INFO|Releasing lport 83133bd7-0bf0-46a6-9cda-315762a021e8 from this chassis (sb_readonly=0)
Feb 26 20:51:17 compute-0 ovn_controller[96598]: 2026-02-26T20:51:17Z|00068|binding|INFO|Setting lport 83133bd7-0bf0-46a6-9cda-315762a021e8 down in Southbound
Feb 26 20:51:17 compute-0 ovn_controller[96598]: 2026-02-26T20:51:17Z|00069|binding|INFO|Removing iface tap83133bd7-0b ovn-installed in OVS
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.677 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.691 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.692 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:0b:72 10.100.0.14'], port_security=['fa:16:3e:77:0b:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'db65189c-3257-4f7c-8407-d99446ead27c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8912f988-fb86-4f9a-91d3-d98453103e4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93f63acb614a4c41813a655e2176374f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'af61bd30-342c-4238-9c48-29adad8f0e57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.232'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4782c29f-d92e-43fa-8dcd-4ddac552e07a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=83133bd7-0bf0-46a6-9cda-315762a021e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.695 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 83133bd7-0bf0-46a6-9cda-315762a021e8 in datapath 8912f988-fb86-4f9a-91d3-d98453103e4e unbound from our chassis
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.698 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8912f988-fb86-4f9a-91d3-d98453103e4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.700 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[d58abd5f-259e-41c4-bd93-fd9a4cadedc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.702 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e namespace which is not needed anymore
Feb 26 20:51:17 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 26 20:51:17 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 12.347s CPU time.
Feb 26 20:51:17 compute-0 systemd-machined[155924]: Machine qemu-3-instance-00000003 terminated.
Feb 26 20:51:17 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [NOTICE]   (218185) : haproxy version is 2.8.14-c23fe91
Feb 26 20:51:17 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [NOTICE]   (218185) : path to executable is /usr/sbin/haproxy
Feb 26 20:51:17 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [WARNING]  (218185) : Exiting Master process...
Feb 26 20:51:17 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [ALERT]    (218185) : Current worker (218187) exited with code 143 (Terminated)
Feb 26 20:51:17 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[218181]: [WARNING]  (218185) : All workers exited. Exiting... (0)
Feb 26 20:51:17 compute-0 systemd[1]: libpod-87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e.scope: Deactivated successfully.
Feb 26 20:51:17 compute-0 podman[218832]: 2026-02-26 20:51:17.846371562 +0000 UTC m=+0.050828737 container died 87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 26 20:51:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e-userdata-shm.mount: Deactivated successfully.
Feb 26 20:51:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-5af58596274cb205068458f18d8181bd7bc35cae1886cc24509987ce2192fc75-merged.mount: Deactivated successfully.
Feb 26 20:51:17 compute-0 podman[218832]: 2026-02-26 20:51:17.894776793 +0000 UTC m=+0.099233938 container cleanup 87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.899 186592 INFO nova.virt.libvirt.driver [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance destroyed successfully.
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.900 186592 DEBUG nova.objects.instance [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'resources' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:17 compute-0 systemd[1]: libpod-conmon-87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e.scope: Deactivated successfully.
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.913 186592 DEBUG nova.virt.libvirt.vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-789364433',display_name='tempest-ServerActionsTestJSON-server-789364433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-789364433',id=3,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKv4jZf/gJ2i41HbUx/UjYlMvLbOCl3KavS3raWK/kJbvOt949QnmXz4hRwBuj0ze7kGjLYbQ3QIBJLoNUIWmSkp5hXwN3v7JqVnHHG54WXxS3hNZgMcy8Kc47SEFtrOtQ==',key_name='tempest-keypair-883907450',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93f63acb614a4c41813a655e2176374f',ramdisk_id='',reservation_id='r-57kan9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-377651542',owner_user_name='tempest-ServerActionsTestJSON-377651542-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:51:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='683dc1563e22496ba81bf3253756023f',uuid=db65189c-3257-4f7c-8407-d99446ead27c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.913 186592 DEBUG nova.network.os_vif_util [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converting VIF {"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.914 186592 DEBUG nova.network.os_vif_util [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.915 186592 DEBUG os_vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.917 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.918 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83133bd7-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.948 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.951 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.954 186592 INFO os_vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b')
Feb 26 20:51:17 compute-0 podman[218880]: 2026-02-26 20:51:17.955978196 +0000 UTC m=+0.039821673 container remove 87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.958 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[d77b8015-1814-416d-a40f-8c4d33d81f76]: (4, ('Thu Feb 26 08:51:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e (87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e)\n87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e\nThu Feb 26 08:51:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e (87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e)\n87f1a1c8206739304bc1698cedec532dff0f779a4245693e09d154ac2affc89e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.960 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[eb99d483-731c-47eb-9173-177c860cddff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.961 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8912f988-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.962 186592 DEBUG nova.virt.libvirt.driver [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Start _get_guest_xml network_info=[{"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.963 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 kernel: tap8912f988-f0: left promiscuous mode
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.968 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.969 186592 WARNING nova.virt.libvirt.driver [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.973 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[a2678a7a-cc11-405b-adea-0714e0121e78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.975 186592 DEBUG nova.virt.libvirt.host [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.976 186592 DEBUG nova.virt.libvirt.host [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.980 186592 DEBUG nova.virt.libvirt.host [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.980 186592 DEBUG nova.virt.libvirt.host [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.981 186592 DEBUG nova.virt.libvirt.driver [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.981 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.981 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.981 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.982 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.982 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.982 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.982 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.982 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.983 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.983 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.983 186592 DEBUG nova.virt.hardware [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:51:17 compute-0 nova_compute[186588]: 2026-02-26 20:51:17.983 186592 DEBUG nova.objects.instance [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'vcpu_model' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.986 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc3434e-0ab5-41d8-a586-c353afd3714c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:17 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.988 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[04ef7d14-9b25-4195-9891-d312c2a8b7a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:17.999 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b6593637-6b98-4317-82d0-a2b9ef102955]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362928, 'reachable_time': 30108, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218895, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.002 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.002 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9684b9-b853-401b-a9c2-b36d660af8a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d8912f988\x2dfb86\x2d4f9a\x2d91d3\x2dd98453103e4e.mount: Deactivated successfully.
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.010 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.057 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.058 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.059 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.059 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.061 186592 DEBUG nova.virt.libvirt.vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-789364433',display_name='tempest-ServerActionsTestJSON-server-789364433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-789364433',id=3,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKv4jZf/gJ2i41HbUx/UjYlMvLbOCl3KavS3raWK/kJbvOt949QnmXz4hRwBuj0ze7kGjLYbQ3QIBJLoNUIWmSkp5hXwN3v7JqVnHHG54WXxS3hNZgMcy8Kc47SEFtrOtQ==',key_name='tempest-keypair-883907450',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93f63acb614a4c41813a655e2176374f',ramdisk_id='',reservation_id='r-57kan9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-377651542',owner_user_name='tempest-ServerActionsTestJSON-377651542-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:51:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='683dc1563e22496ba81bf3253756023f',uuid=db65189c-3257-4f7c-8407-d99446ead27c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.061 186592 DEBUG nova.network.os_vif_util [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converting VIF {"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.062 186592 DEBUG nova.network.os_vif_util [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.063 186592 DEBUG nova.objects.instance [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'pci_devices' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.079 186592 DEBUG nova.virt.libvirt.driver [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <uuid>db65189c-3257-4f7c-8407-d99446ead27c</uuid>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <name>instance-00000003</name>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:name>tempest-ServerActionsTestJSON-server-789364433</nova:name>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:51:17</nova:creationTime>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:user uuid="683dc1563e22496ba81bf3253756023f">tempest-ServerActionsTestJSON-377651542-project-member</nova:user>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:project uuid="93f63acb614a4c41813a655e2176374f">tempest-ServerActionsTestJSON-377651542</nova:project>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         <nova:port uuid="83133bd7-0bf0-46a6-9cda-315762a021e8">
Feb 26 20:51:18 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <system>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <entry name="serial">db65189c-3257-4f7c-8407-d99446ead27c</entry>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <entry name="uuid">db65189c-3257-4f7c-8407-d99446ead27c</entry>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </system>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <os>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </os>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <features>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </features>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk.config"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:77:0b:72"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <target dev="tap83133bd7-0b"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/console.log" append="off"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <video>
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </video>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <input type="keyboard" bus="usb"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:51:18 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:51:18 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:51:18 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:51:18 compute-0 nova_compute[186588]: </domain>
Feb 26 20:51:18 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.081 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.131 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.133 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.186 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.187 186592 DEBUG nova.objects.instance [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'trusted_certs' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.203 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.260 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.262 186592 DEBUG nova.virt.disk.api [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Checking if we can resize image /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.262 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.344 186592 DEBUG oslo_concurrency.processutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.345 186592 DEBUG nova.virt.disk.api [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Cannot resize image /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.345 186592 DEBUG nova.objects.instance [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'migration_context' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.377 186592 DEBUG nova.virt.libvirt.vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-789364433',display_name='tempest-ServerActionsTestJSON-server-789364433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-789364433',id=3,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKv4jZf/gJ2i41HbUx/UjYlMvLbOCl3KavS3raWK/kJbvOt949QnmXz4hRwBuj0ze7kGjLYbQ3QIBJLoNUIWmSkp5hXwN3v7JqVnHHG54WXxS3hNZgMcy8Kc47SEFtrOtQ==',key_name='tempest-keypair-883907450',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='93f63acb614a4c41813a655e2176374f',ramdisk_id='',reservation_id='r-57kan9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-377651542',owner_user_name='tempest-ServerActionsTestJSON-377651542-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:51:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='683dc1563e22496ba81bf3253756023f',uuid=db65189c-3257-4f7c-8407-d99446ead27c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.378 186592 DEBUG nova.network.os_vif_util [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converting VIF {"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.380 186592 DEBUG nova.network.os_vif_util [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.381 186592 DEBUG os_vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.382 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.382 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.383 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.388 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.388 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83133bd7-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.389 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap83133bd7-0b, col_values=(('external_ids', {'iface-id': '83133bd7-0bf0-46a6-9cda-315762a021e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:0b:72', 'vm-uuid': 'db65189c-3257-4f7c-8407-d99446ead27c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.391 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.3927] manager: (tap83133bd7-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.394 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.397 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.399 186592 INFO os_vif [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b')
Feb 26 20:51:18 compute-0 kernel: tap83133bd7-0b: entered promiscuous mode
Feb 26 20:51:18 compute-0 systemd-udevd[218809]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:51:18 compute-0 ovn_controller[96598]: 2026-02-26T20:51:18Z|00070|binding|INFO|Claiming lport 83133bd7-0bf0-46a6-9cda-315762a021e8 for this chassis.
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.4676] manager: (tap83133bd7-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Feb 26 20:51:18 compute-0 ovn_controller[96598]: 2026-02-26T20:51:18Z|00071|binding|INFO|83133bd7-0bf0-46a6-9cda-315762a021e8: Claiming fa:16:3e:77:0b:72 10.100.0.14
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.467 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 ovn_controller[96598]: 2026-02-26T20:51:18Z|00072|binding|INFO|Setting lport 83133bd7-0bf0-46a6-9cda-315762a021e8 ovn-installed in OVS
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.473 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.475 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 ovn_controller[96598]: 2026-02-26T20:51:18Z|00073|binding|INFO|Setting lport 83133bd7-0bf0-46a6-9cda-315762a021e8 up in Southbound
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.477 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:0b:72 10.100.0.14'], port_security=['fa:16:3e:77:0b:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'db65189c-3257-4f7c-8407-d99446ead27c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8912f988-fb86-4f9a-91d3-d98453103e4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93f63acb614a4c41813a655e2176374f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'af61bd30-342c-4238-9c48-29adad8f0e57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.232'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4782c29f-d92e-43fa-8dcd-4ddac552e07a, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=83133bd7-0bf0-46a6-9cda-315762a021e8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.4786] device (tap83133bd7-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.4794] device (tap83133bd7-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.479 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 83133bd7-0bf0-46a6-9cda-315762a021e8 in datapath 8912f988-fb86-4f9a-91d3-d98453103e4e bound to our chassis
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.482 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8912f988-fb86-4f9a-91d3-d98453103e4e
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.494 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[f4699f04-bdd5-497e-aaec-0c2c9ab605b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.495 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8912f988-f1 in ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:51:18 compute-0 systemd-machined[155924]: New machine qemu-5-instance-00000003.
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.497 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8912f988-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.497 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[44dde8f7-f8f9-4633-bfe2-d4ed3fa569cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.498 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[09f7fa94-bb6c-48b9-bed3-c03dcd9be40f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.508 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[d4296e50-1d89-48ce-952e-cb3724eed56f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000003.
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.519 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[98884f3e-3609-44f6-b67a-c6f68d7e169b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.543 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[2904915f-12e1-405a-8552-a8d0ffca1047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.549 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ac77f562-40ca-4518-a452-2226780eb31c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.5505] manager: (tap8912f988-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.576 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[8c635870-87da-4545-9eae-97e3e1b03b31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.579 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2b4576-a2e4-4420-846a-0905a2a73481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.5976] device (tap8912f988-f0): carrier: link connected
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.600 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[2781f40c-1c57-4c32-a0cb-048fad402086]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.616 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[80e53558-b1d3-4546-95ca-4206b8edcf6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8912f988-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:0d:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366515, 'reachable_time': 43144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218958, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.632 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[f89b1836-4516-4091-affb-c2c4c0d2ec86]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:dc1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366515, 'tstamp': 366515}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218959, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.649 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6e80a6-3841-4bb7-9b44-95db803d685a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8912f988-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:0d:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366515, 'reachable_time': 43144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218960, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.676 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2af143-0783-412b-90eb-daa708978f02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.731 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[803d95c4-0442-4412-9847-a76ae051c8c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.733 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8912f988-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.733 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.734 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8912f988-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:18 compute-0 NetworkManager[56360]: <info>  [1772139078.7370] manager: (tap8912f988-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Feb 26 20:51:18 compute-0 kernel: tap8912f988-f0: entered promiscuous mode
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.742 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8912f988-f0, col_values=(('external_ids', {'iface-id': 'feef4d0a-7ad6-4fc7-99f1-0f847997a8be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:18 compute-0 ovn_controller[96598]: 2026-02-26T20:51:18Z|00074|binding|INFO|Releasing lport feef4d0a-7ad6-4fc7-99f1-0f847997a8be from this chassis (sb_readonly=0)
Feb 26 20:51:18 compute-0 nova_compute[186588]: 2026-02-26 20:51:18.744 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.745 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8912f988-fb86-4f9a-91d3-d98453103e4e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8912f988-fb86-4f9a-91d3-d98453103e4e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.746 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[30eba4a5-aacc-4648-a2e6-802c662381b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.747 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-8912f988-fb86-4f9a-91d3-d98453103e4e
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/8912f988-fb86-4f9a-91d3-d98453103e4e.pid.haproxy
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 8912f988-fb86-4f9a-91d3-d98453103e4e
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:51:18 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:18.748 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'env', 'PROCESS_TAG=haproxy-8912f988-fb86-4f9a-91d3-d98453103e4e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8912f988-fb86-4f9a-91d3-d98453103e4e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:51:19 compute-0 podman[218992]: 2026-02-26 20:51:19.11585744 +0000 UTC m=+0.049792159 container create 69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223)
Feb 26 20:51:19 compute-0 systemd[1]: Started libpod-conmon-69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7.scope.
Feb 26 20:51:19 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:51:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd71ed1796b2b6939f6935559eaefcff93f23a8a5253baa31a545105763d4ee9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:51:19 compute-0 podman[218992]: 2026-02-26 20:51:19.096395091 +0000 UTC m=+0.030329820 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:51:19 compute-0 podman[218992]: 2026-02-26 20:51:19.197753325 +0000 UTC m=+0.131688074 container init 69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 26 20:51:19 compute-0 podman[218992]: 2026-02-26 20:51:19.206781966 +0000 UTC m=+0.140716685 container start 69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 26 20:51:19 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [NOTICE]   (219012) : New worker (219019) forked
Feb 26 20:51:19 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [NOTICE]   (219012) : Loading success.
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.326 186592 DEBUG nova.virt.libvirt.host [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Removed pending event for db65189c-3257-4f7c-8407-d99446ead27c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.327 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139079.3253372, db65189c-3257-4f7c-8407-d99446ead27c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.327 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] VM Resumed (Lifecycle Event)
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.330 186592 DEBUG nova.compute.manager [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.335 186592 INFO nova.virt.libvirt.driver [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance rebooted successfully.
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.336 186592 DEBUG nova.compute.manager [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.389 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.395 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.425 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.426 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139079.326749, db65189c-3257-4f7c-8407-d99446ead27c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.427 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] VM Started (Lifecycle Event)
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.434 186592 DEBUG oslo_concurrency.lockutils [None req-8a9c5888-a168-4c34-8eda-5a4f3aafebbb 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 5.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.444 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.449 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.464 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.465 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.480 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.568 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.569 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.577 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.577 186592 INFO nova.compute.claims [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.665 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.723 186592 DEBUG nova.compute.provider_tree [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.738 186592 DEBUG nova.scheduler.client.report [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.759 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.760 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.799 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.800 186592 DEBUG nova.network.neutron [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.821 186592 INFO nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.839 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.934 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.935 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.936 186592 INFO nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Creating image(s)
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.937 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "/var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.937 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "/var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.938 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "/var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:19 compute-0 nova_compute[186588]: 2026-02-26 20:51:19.953 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.036 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.037 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.038 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.048 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.097 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.098 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.130 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.131 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.132 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.190 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.192 186592 DEBUG nova.virt.disk.api [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Checking if we can resize image /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.193 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.235 186592 DEBUG nova.policy [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6bdd3e90ca54c35a342ed1197b27c8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9675902f465e4c1c91aa9f01efef2bcd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.247 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.249 186592 DEBUG nova.virt.disk.api [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Cannot resize image /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.250 186592 DEBUG nova.objects.instance [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lazy-loading 'migration_context' on Instance uuid 306c50ba-63e3-498f-8566-5f0bec7c6f16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.280 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.281 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Ensure instance console log exists: /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.282 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.283 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:20 compute-0 nova_compute[186588]: 2026-02-26 20:51:20.283 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:21 compute-0 nova_compute[186588]: 2026-02-26 20:51:21.612 186592 DEBUG nova.network.neutron [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Successfully created port: e3ce81b9-d6fc-4207-9379-266c99cd8d12 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:51:21 compute-0 nova_compute[186588]: 2026-02-26 20:51:21.770 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:22 compute-0 podman[219047]: 2026-02-26 20:51:22.565028781 +0000 UTC m=+0.079354508 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.817 186592 DEBUG nova.network.neutron [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Successfully updated port: e3ce81b9-d6fc-4207-9379-266c99cd8d12 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.830 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.831 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.831 186592 DEBUG nova.network.neutron [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.958 186592 DEBUG nova.compute.manager [req-9c9e7d8b-7a0a-49a8-a7fa-106d1d554bbf req-539e1adb-fea1-4a3e-97fa-5d5276d21579 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-unplugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.959 186592 DEBUG oslo_concurrency.lockutils [req-9c9e7d8b-7a0a-49a8-a7fa-106d1d554bbf req-539e1adb-fea1-4a3e-97fa-5d5276d21579 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.959 186592 DEBUG oslo_concurrency.lockutils [req-9c9e7d8b-7a0a-49a8-a7fa-106d1d554bbf req-539e1adb-fea1-4a3e-97fa-5d5276d21579 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.959 186592 DEBUG oslo_concurrency.lockutils [req-9c9e7d8b-7a0a-49a8-a7fa-106d1d554bbf req-539e1adb-fea1-4a3e-97fa-5d5276d21579 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.960 186592 DEBUG nova.compute.manager [req-9c9e7d8b-7a0a-49a8-a7fa-106d1d554bbf req-539e1adb-fea1-4a3e-97fa-5d5276d21579 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-unplugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:22 compute-0 nova_compute[186588]: 2026-02-26 20:51:22.960 186592 WARNING nova.compute.manager [req-9c9e7d8b-7a0a-49a8-a7fa-106d1d554bbf req-539e1adb-fea1-4a3e-97fa-5d5276d21579 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received unexpected event network-vif-unplugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with vm_state active and task_state None.
Feb 26 20:51:23 compute-0 nova_compute[186588]: 2026-02-26 20:51:23.095 186592 DEBUG nova.network.neutron [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:51:23 compute-0 nova_compute[186588]: 2026-02-26 20:51:23.391 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.076 186592 DEBUG nova.network.neutron [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.091 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.091 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Instance network_info: |[{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.093 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Start _get_guest_xml network_info=[{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.097 186592 WARNING nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.100 186592 DEBUG nova.virt.libvirt.host [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.101 186592 DEBUG nova.virt.libvirt.host [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.104 186592 DEBUG nova.virt.libvirt.host [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.104 186592 DEBUG nova.virt.libvirt.host [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.105 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.105 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.105 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.106 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.106 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.106 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.107 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.107 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.108 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.108 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.108 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.108 186592 DEBUG nova.virt.hardware [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.111 186592 DEBUG nova.virt.libvirt.vif [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:51:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2101132382',display_name='tempest-AttachInterfacesUnderV243Test-server-2101132382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2101132382',id=5,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjpvmwPzIY7GfWacos52pyUK7MgQMW6oZwTpV32LBP4pG/tIcrdYeEVFDm0M5iLwayDtke+F6C95ipnP4EGy5v3daqlrfgtsrqzkPS74x9iw9TPmFPnWvd3AxDsFRrUyw==',key_name='tempest-keypair-1091348862',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9675902f465e4c1c91aa9f01efef2bcd',ramdisk_id='',reservation_id='r-9okq05ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-61122046',owner_user_name='tempest-AttachInterfacesUnderV243Test-61122046-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:51:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e6bdd3e90ca54c35a342ed1197b27c8f',uuid=306c50ba-63e3-498f-8566-5f0bec7c6f16,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.112 186592 DEBUG nova.network.os_vif_util [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Converting VIF {"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.112 186592 DEBUG nova.network.os_vif_util [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.113 186592 DEBUG nova.objects.instance [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lazy-loading 'pci_devices' on Instance uuid 306c50ba-63e3-498f-8566-5f0bec7c6f16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.131 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <uuid>306c50ba-63e3-498f-8566-5f0bec7c6f16</uuid>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <name>instance-00000005</name>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:name>tempest-AttachInterfacesUnderV243Test-server-2101132382</nova:name>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:51:24</nova:creationTime>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:user uuid="e6bdd3e90ca54c35a342ed1197b27c8f">tempest-AttachInterfacesUnderV243Test-61122046-project-member</nova:user>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:project uuid="9675902f465e4c1c91aa9f01efef2bcd">tempest-AttachInterfacesUnderV243Test-61122046</nova:project>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         <nova:port uuid="e3ce81b9-d6fc-4207-9379-266c99cd8d12">
Feb 26 20:51:24 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <system>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <entry name="serial">306c50ba-63e3-498f-8566-5f0bec7c6f16</entry>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <entry name="uuid">306c50ba-63e3-498f-8566-5f0bec7c6f16</entry>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </system>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <os>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </os>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <features>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </features>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.config"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:98:67:34"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <target dev="tape3ce81b9-d6"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/console.log" append="off"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <video>
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </video>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:51:24 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:51:24 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:51:24 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:51:24 compute-0 nova_compute[186588]: </domain>
Feb 26 20:51:24 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.135 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Preparing to wait for external event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.135 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.136 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.136 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.137 186592 DEBUG nova.virt.libvirt.vif [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:51:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2101132382',display_name='tempest-AttachInterfacesUnderV243Test-server-2101132382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2101132382',id=5,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjpvmwPzIY7GfWacos52pyUK7MgQMW6oZwTpV32LBP4pG/tIcrdYeEVFDm0M5iLwayDtke+F6C95ipnP4EGy5v3daqlrfgtsrqzkPS74x9iw9TPmFPnWvd3AxDsFRrUyw==',key_name='tempest-keypair-1091348862',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9675902f465e4c1c91aa9f01efef2bcd',ramdisk_id='',reservation_id='r-9okq05ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-61122046',owner_user_name='tempest-AttachInterfacesUnderV243Test-61122046-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:51:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e6bdd3e90ca54c35a342ed1197b27c8f',uuid=306c50ba-63e3-498f-8566-5f0bec7c6f16,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.137 186592 DEBUG nova.network.os_vif_util [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Converting VIF {"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.139 186592 DEBUG nova.network.os_vif_util [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.139 186592 DEBUG os_vif [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.140 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.140 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.141 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.143 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.143 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3ce81b9-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.144 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape3ce81b9-d6, col_values=(('external_ids', {'iface-id': 'e3ce81b9-d6fc-4207-9379-266c99cd8d12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:67:34', 'vm-uuid': '306c50ba-63e3-498f-8566-5f0bec7c6f16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.145 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 NetworkManager[56360]: <info>  [1772139084.1466] manager: (tape3ce81b9-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.147 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.150 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.151 186592 INFO os_vif [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6')
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.196 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.197 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.197 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] No VIF found with MAC fa:16:3e:98:67:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.197 186592 INFO nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Using config drive
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.575 186592 INFO nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Creating config drive at /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.config
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.579 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp9hdm1nx6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.726 186592 DEBUG oslo_concurrency.processutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp9hdm1nx6" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.729 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 kernel: tape3ce81b9-d6: entered promiscuous mode
Feb 26 20:51:24 compute-0 NetworkManager[56360]: <info>  [1772139084.7650] manager: (tape3ce81b9-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Feb 26 20:51:24 compute-0 ovn_controller[96598]: 2026-02-26T20:51:24Z|00075|binding|INFO|Claiming lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 for this chassis.
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.771 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 ovn_controller[96598]: 2026-02-26T20:51:24Z|00076|binding|INFO|e3ce81b9-d6fc-4207-9379-266c99cd8d12: Claiming fa:16:3e:98:67:34 10.100.0.7
Feb 26 20:51:24 compute-0 ovn_controller[96598]: 2026-02-26T20:51:24Z|00077|binding|INFO|Setting lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 ovn-installed in OVS
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.781 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 nova_compute[186588]: 2026-02-26 20:51:24.783 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:24 compute-0 ovn_controller[96598]: 2026-02-26T20:51:24Z|00078|binding|INFO|Setting lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 up in Southbound
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.788 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:67:34 10.100.0.7'], port_security=['fa:16:3e:98:67:34 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '306c50ba-63e3-498f-8566-5f0bec7c6f16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b80a050-87c9-4751-99e2-3a99a2801616', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9675902f465e4c1c91aa9f01efef2bcd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '40aec3d6-8577-468e-aef7-841bb25f2273', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65647a3d-64ee-4360-9338-ee166e181aba, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=e3ce81b9-d6fc-4207-9379-266c99cd8d12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.789 105929 INFO neutron.agent.ovn.metadata.agent [-] Port e3ce81b9-d6fc-4207-9379-266c99cd8d12 in datapath 6b80a050-87c9-4751-99e2-3a99a2801616 bound to our chassis
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.791 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b80a050-87c9-4751-99e2-3a99a2801616
Feb 26 20:51:24 compute-0 systemd-machined[155924]: New machine qemu-6-instance-00000005.
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.798 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[622cff42-db21-410c-a83f-059f930c98be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.799 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b80a050-81 in ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.802 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b80a050-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.802 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[d069e234-998b-4889-b859-654179a1f386]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.803 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb98c04-2174-4cab-9e71-24dc68221020]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000005.
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.811 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[06ab8e9e-7482-410a-a8c4-0f50663a8ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.818 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[6507fbb0-e387-4437-90fb-6f02e40f14fe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 systemd-udevd[219097]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:51:24 compute-0 NetworkManager[56360]: <info>  [1772139084.8392] device (tape3ce81b9-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.840 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[a95e6a83-a596-443a-8981-5d64ed6ff633]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 NetworkManager[56360]: <info>  [1772139084.8421] device (tape3ce81b9-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:51:24 compute-0 NetworkManager[56360]: <info>  [1772139084.8452] manager: (tap6b80a050-80): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.846 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[9e099f16-961a-4192-85e1-4616ba66c218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 systemd-udevd[219100]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.866 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[e12ced55-775c-452b-bd9d-a6997c0c3f6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.868 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e8a627-da3d-48e9-b472-15ecf79b9070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 NetworkManager[56360]: <info>  [1772139084.8826] device (tap6b80a050-80): carrier: link connected
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.884 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff0a067-f50c-4a1d-a29b-9ae3bcc99c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.908 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[3ece3db7-423d-404b-b525-eadaf82cda3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b80a050-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:df:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367144, 'reachable_time': 25541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219126, 'error': None, 'target': 'ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.920 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e1de30-0da2-4f35-9e73-a37f995206da]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe42:df14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 367144, 'tstamp': 367144}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219127, 'error': None, 'target': 'ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.931 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[736a66ae-0d69-4358-bac9-bd1496f56a0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b80a050-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:42:df:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367144, 'reachable_time': 25541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219128, 'error': None, 'target': 'ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:24 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:24.960 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecfe1eb-a0e9-4971-a3c0-04d60c10de0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.005 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcb85de-a0af-4731-9242-ffc469207b7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.007 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b80a050-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.007 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.007 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b80a050-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:25 compute-0 NetworkManager[56360]: <info>  [1772139085.0096] manager: (tap6b80a050-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb 26 20:51:25 compute-0 kernel: tap6b80a050-80: entered promiscuous mode
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.010 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.017 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b80a050-80, col_values=(('external_ids', {'iface-id': 'b54d5750-df20-4726-81d0-79644d2b6369'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.018 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.019 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:25 compute-0 ovn_controller[96598]: 2026-02-26T20:51:25Z|00079|binding|INFO|Releasing lport b54d5750-df20-4726-81d0-79644d2b6369 from this chassis (sb_readonly=0)
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.024 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.027 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b80a050-87c9-4751-99e2-3a99a2801616.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b80a050-87c9-4751-99e2-3a99a2801616.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.028 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[497757c0-f9b7-468a-87bb-949ae0e3977e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.029 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-6b80a050-87c9-4751-99e2-3a99a2801616
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/6b80a050-87c9-4751-99e2-3a99a2801616.pid.haproxy
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 6b80a050-87c9-4751-99e2-3a99a2801616
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:51:25 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:25.029 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616', 'env', 'PROCESS_TAG=haproxy-6b80a050-87c9-4751-99e2-3a99a2801616', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b80a050-87c9-4751-99e2-3a99a2801616.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.140 186592 DEBUG nova.compute.manager [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.141 186592 DEBUG oslo_concurrency.lockutils [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.141 186592 DEBUG oslo_concurrency.lockutils [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.141 186592 DEBUG oslo_concurrency.lockutils [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.141 186592 DEBUG nova.compute.manager [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.142 186592 WARNING nova.compute.manager [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received unexpected event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with vm_state active and task_state None.
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.142 186592 DEBUG nova.compute.manager [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.143 186592 DEBUG nova.compute.manager [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing instance network info cache due to event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.143 186592 DEBUG oslo_concurrency.lockutils [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.143 186592 DEBUG oslo_concurrency.lockutils [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.143 186592 DEBUG nova.network.neutron [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.288 186592 DEBUG nova.compute.manager [req-994509b0-b6a7-4ad5-8ed4-19b0e7e068e9 req-b667db88-40b1-4303-b3e4-261fb9f106bf d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.289 186592 DEBUG oslo_concurrency.lockutils [req-994509b0-b6a7-4ad5-8ed4-19b0e7e068e9 req-b667db88-40b1-4303-b3e4-261fb9f106bf d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.289 186592 DEBUG oslo_concurrency.lockutils [req-994509b0-b6a7-4ad5-8ed4-19b0e7e068e9 req-b667db88-40b1-4303-b3e4-261fb9f106bf d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.289 186592 DEBUG oslo_concurrency.lockutils [req-994509b0-b6a7-4ad5-8ed4-19b0e7e068e9 req-b667db88-40b1-4303-b3e4-261fb9f106bf d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.290 186592 DEBUG nova.compute.manager [req-994509b0-b6a7-4ad5-8ed4-19b0e7e068e9 req-b667db88-40b1-4303-b3e4-261fb9f106bf d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Processing event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:51:25 compute-0 podman[219158]: 2026-02-26 20:51:25.370794357 +0000 UTC m=+0.057147126 container create 1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0)
Feb 26 20:51:25 compute-0 systemd[1]: Started libpod-conmon-1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8.scope.
Feb 26 20:51:25 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:51:25 compute-0 podman[219158]: 2026-02-26 20:51:25.331561741 +0000 UTC m=+0.017914530 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:51:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d8ac5abd19887909fb91138dee8c9202452df4a423e1a679eb0c9c096900864/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:51:25 compute-0 podman[219158]: 2026-02-26 20:51:25.445666415 +0000 UTC m=+0.132019204 container init 1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:51:25 compute-0 podman[219158]: 2026-02-26 20:51:25.449531158 +0000 UTC m=+0.135883927 container start 1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0)
Feb 26 20:51:25 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [NOTICE]   (219175) : New worker (219177) forked
Feb 26 20:51:25 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [NOTICE]   (219175) : Loading success.
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.840 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.841 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139085.841283, 306c50ba-63e3-498f-8566-5f0bec7c6f16 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.842 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] VM Started (Lifecycle Event)
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.844 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.848 186592 INFO nova.virt.libvirt.driver [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Instance spawned successfully.
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.848 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.863 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.868 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.871 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.872 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.872 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.872 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.873 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.873 186592 DEBUG nova.virt.libvirt.driver [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.897 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.897 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139085.842138, 306c50ba-63e3-498f-8566-5f0bec7c6f16 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.898 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] VM Paused (Lifecycle Event)
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.949 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.954 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139085.8444083, 306c50ba-63e3-498f-8566-5f0bec7c6f16 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.955 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] VM Resumed (Lifecycle Event)
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.978 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.982 186592 INFO nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Took 6.05 seconds to spawn the instance on the hypervisor.
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.982 186592 DEBUG nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:25 compute-0 nova_compute[186588]: 2026-02-26 20:51:25.983 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:51:26 compute-0 nova_compute[186588]: 2026-02-26 20:51:26.010 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:51:26 compute-0 nova_compute[186588]: 2026-02-26 20:51:26.057 186592 INFO nova.compute.manager [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Took 6.52 seconds to build instance.
Feb 26 20:51:26 compute-0 nova_compute[186588]: 2026-02-26 20:51:26.075 186592 DEBUG oslo_concurrency.lockutils [None req-eae9498d-d53c-4821-aea7-d9216b19b299 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:27 compute-0 nova_compute[186588]: 2026-02-26 20:51:27.448 186592 DEBUG nova.compute.manager [req-33746509-90e3-4a4d-b571-9b09939e0c0f req-511ae9fe-850f-41d2-97e5-be7dea01ee60 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:27 compute-0 nova_compute[186588]: 2026-02-26 20:51:27.449 186592 DEBUG oslo_concurrency.lockutils [req-33746509-90e3-4a4d-b571-9b09939e0c0f req-511ae9fe-850f-41d2-97e5-be7dea01ee60 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:27 compute-0 nova_compute[186588]: 2026-02-26 20:51:27.449 186592 DEBUG oslo_concurrency.lockutils [req-33746509-90e3-4a4d-b571-9b09939e0c0f req-511ae9fe-850f-41d2-97e5-be7dea01ee60 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:27 compute-0 nova_compute[186588]: 2026-02-26 20:51:27.449 186592 DEBUG oslo_concurrency.lockutils [req-33746509-90e3-4a4d-b571-9b09939e0c0f req-511ae9fe-850f-41d2-97e5-be7dea01ee60 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:27 compute-0 nova_compute[186588]: 2026-02-26 20:51:27.450 186592 DEBUG nova.compute.manager [req-33746509-90e3-4a4d-b571-9b09939e0c0f req-511ae9fe-850f-41d2-97e5-be7dea01ee60 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] No waiting events found dispatching network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:27 compute-0 nova_compute[186588]: 2026-02-26 20:51:27.450 186592 WARNING nova.compute.manager [req-33746509-90e3-4a4d-b571-9b09939e0c0f req-511ae9fe-850f-41d2-97e5-be7dea01ee60 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received unexpected event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 for instance with vm_state active and task_state None.
Feb 26 20:51:28 compute-0 nova_compute[186588]: 2026-02-26 20:51:28.371 186592 DEBUG nova.network.neutron [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updated VIF entry in instance network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:51:28 compute-0 nova_compute[186588]: 2026-02-26 20:51:28.373 186592 DEBUG nova.network.neutron [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:28 compute-0 nova_compute[186588]: 2026-02-26 20:51:28.404 186592 DEBUG oslo_concurrency.lockutils [req-51b0448b-80a7-46c7-9857-7d070e88be2e req-83904f9e-6159-40de-94f6-7cdd746d4753 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.083 186592 DEBUG nova.compute.manager [req-19489257-97fa-47ac-9abf-f77f1f05becf req-7fab2127-24ce-4663-8e6b-af58135b6ae7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.084 186592 DEBUG oslo_concurrency.lockutils [req-19489257-97fa-47ac-9abf-f77f1f05becf req-7fab2127-24ce-4663-8e6b-af58135b6ae7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.085 186592 DEBUG oslo_concurrency.lockutils [req-19489257-97fa-47ac-9abf-f77f1f05becf req-7fab2127-24ce-4663-8e6b-af58135b6ae7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.085 186592 DEBUG oslo_concurrency.lockutils [req-19489257-97fa-47ac-9abf-f77f1f05becf req-7fab2127-24ce-4663-8e6b-af58135b6ae7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.086 186592 DEBUG nova.compute.manager [req-19489257-97fa-47ac-9abf-f77f1f05becf req-7fab2127-24ce-4663-8e6b-af58135b6ae7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.086 186592 WARNING nova.compute.manager [req-19489257-97fa-47ac-9abf-f77f1f05becf req-7fab2127-24ce-4663-8e6b-af58135b6ae7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received unexpected event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with vm_state active and task_state None.
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.087 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.146 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:29 compute-0 podman[219193]: 2026-02-26 20:51:29.544708274 +0000 UTC m=+0.052843691 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 26 20:51:29 compute-0 nova_compute[186588]: 2026-02-26 20:51:29.729 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:29 compute-0 podman[202527]: time="2026-02-26T20:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:51:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 24454 "" "Go-http-client/1.1"
Feb 26 20:51:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3966 "" "Go-http-client/1.1"
Feb 26 20:51:30 compute-0 nova_compute[186588]: 2026-02-26 20:51:30.771 186592 DEBUG nova.compute.manager [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:30 compute-0 nova_compute[186588]: 2026-02-26 20:51:30.771 186592 DEBUG nova.compute.manager [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing instance network info cache due to event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:51:30 compute-0 nova_compute[186588]: 2026-02-26 20:51:30.772 186592 DEBUG oslo_concurrency.lockutils [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:30 compute-0 nova_compute[186588]: 2026-02-26 20:51:30.772 186592 DEBUG oslo_concurrency.lockutils [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:30 compute-0 nova_compute[186588]: 2026-02-26 20:51:30.772 186592 DEBUG nova.network.neutron [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:51:30 compute-0 ovn_controller[96598]: 2026-02-26T20:51:30Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:0b:72 10.100.0.14
Feb 26 20:51:31 compute-0 nova_compute[186588]: 2026-02-26 20:51:31.410 186592 DEBUG nova.compute.manager [req-af5b3571-a072-40e4-b5fc-d71798681baa req-3dfab146-c4fb-4c8d-9c73-409d46d0f38d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:31 compute-0 nova_compute[186588]: 2026-02-26 20:51:31.410 186592 DEBUG oslo_concurrency.lockutils [req-af5b3571-a072-40e4-b5fc-d71798681baa req-3dfab146-c4fb-4c8d-9c73-409d46d0f38d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:31 compute-0 nova_compute[186588]: 2026-02-26 20:51:31.411 186592 DEBUG oslo_concurrency.lockutils [req-af5b3571-a072-40e4-b5fc-d71798681baa req-3dfab146-c4fb-4c8d-9c73-409d46d0f38d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:31 compute-0 nova_compute[186588]: 2026-02-26 20:51:31.411 186592 DEBUG oslo_concurrency.lockutils [req-af5b3571-a072-40e4-b5fc-d71798681baa req-3dfab146-c4fb-4c8d-9c73-409d46d0f38d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:31 compute-0 nova_compute[186588]: 2026-02-26 20:51:31.411 186592 DEBUG nova.compute.manager [req-af5b3571-a072-40e4-b5fc-d71798681baa req-3dfab146-c4fb-4c8d-9c73-409d46d0f38d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:31 compute-0 nova_compute[186588]: 2026-02-26 20:51:31.412 186592 WARNING nova.compute.manager [req-af5b3571-a072-40e4-b5fc-d71798681baa req-3dfab146-c4fb-4c8d-9c73-409d46d0f38d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received unexpected event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with vm_state active and task_state None.
Feb 26 20:51:31 compute-0 openstack_network_exporter[205682]: ERROR   20:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:51:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:51:31 compute-0 openstack_network_exporter[205682]: ERROR   20:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:51:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:51:33 compute-0 podman[219231]: 2026-02-26 20:51:33.541568947 +0000 UTC m=+0.053077927 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64, release=1770267347, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 26 20:51:33 compute-0 nova_compute[186588]: 2026-02-26 20:51:33.877 186592 DEBUG nova.network.neutron [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updated VIF entry in instance network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:51:33 compute-0 nova_compute[186588]: 2026-02-26 20:51:33.880 186592 DEBUG nova.network.neutron [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:33 compute-0 nova_compute[186588]: 2026-02-26 20:51:33.897 186592 DEBUG oslo_concurrency.lockutils [req-97f5233d-e48b-4b00-8f59-ac1921bf118b req-10a73bb1-b0e2-4214-ab7d-2fc157383073 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:34 compute-0 nova_compute[186588]: 2026-02-26 20:51:34.148 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:34 compute-0 nova_compute[186588]: 2026-02-26 20:51:34.732 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:36 compute-0 nova_compute[186588]: 2026-02-26 20:51:36.775 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:36 compute-0 ovn_controller[96598]: 2026-02-26T20:51:36Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:67:34 10.100.0.7
Feb 26 20:51:36 compute-0 ovn_controller[96598]: 2026-02-26T20:51:36Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:67:34 10.100.0.7
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.152 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.736 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.900 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.901 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.901 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.901 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.902 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.904 186592 INFO nova.compute.manager [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Terminating instance
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.905 186592 DEBUG nova.compute.manager [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:51:39 compute-0 kernel: tap83133bd7-0b (unregistering): left promiscuous mode
Feb 26 20:51:39 compute-0 NetworkManager[56360]: <info>  [1772139099.9489] device (tap83133bd7-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:51:39 compute-0 ovn_controller[96598]: 2026-02-26T20:51:39Z|00080|binding|INFO|Releasing lport 83133bd7-0bf0-46a6-9cda-315762a021e8 from this chassis (sb_readonly=0)
Feb 26 20:51:39 compute-0 ovn_controller[96598]: 2026-02-26T20:51:39Z|00081|binding|INFO|Setting lport 83133bd7-0bf0-46a6-9cda-315762a021e8 down in Southbound
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.958 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:39 compute-0 ovn_controller[96598]: 2026-02-26T20:51:39Z|00082|binding|INFO|Removing iface tap83133bd7-0b ovn-installed in OVS
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.963 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:39.969 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:0b:72 10.100.0.14'], port_security=['fa:16:3e:77:0b:72 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'db65189c-3257-4f7c-8407-d99446ead27c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8912f988-fb86-4f9a-91d3-d98453103e4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93f63acb614a4c41813a655e2176374f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'af61bd30-342c-4238-9c48-29adad8f0e57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.232'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4782c29f-d92e-43fa-8dcd-4ddac552e07a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=83133bd7-0bf0-46a6-9cda-315762a021e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:51:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:39.971 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 83133bd7-0bf0-46a6-9cda-315762a021e8 in datapath 8912f988-fb86-4f9a-91d3-d98453103e4e unbound from our chassis
Feb 26 20:51:39 compute-0 nova_compute[186588]: 2026-02-26 20:51:39.970 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:39.973 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8912f988-fb86-4f9a-91d3-d98453103e4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:51:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:39.975 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[5b57c5bf-d4f6-4825-9733-42888f872a6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:39 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:39.975 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e namespace which is not needed anymore
Feb 26 20:51:40 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 26 20:51:40 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000003.scope: Consumed 12.422s CPU time.
Feb 26 20:51:40 compute-0 systemd-machined[155924]: Machine qemu-5-instance-00000003 terminated.
Feb 26 20:51:40 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [NOTICE]   (219012) : haproxy version is 2.8.14-c23fe91
Feb 26 20:51:40 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [NOTICE]   (219012) : path to executable is /usr/sbin/haproxy
Feb 26 20:51:40 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [WARNING]  (219012) : Exiting Master process...
Feb 26 20:51:40 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [ALERT]    (219012) : Current worker (219019) exited with code 143 (Terminated)
Feb 26 20:51:40 compute-0 neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e[219008]: [WARNING]  (219012) : All workers exited. Exiting... (0)
Feb 26 20:51:40 compute-0 systemd[1]: libpod-69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7.scope: Deactivated successfully.
Feb 26 20:51:40 compute-0 podman[219292]: 2026-02-26 20:51:40.116515541 +0000 UTC m=+0.043949904 container died 69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223)
Feb 26 20:51:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7-userdata-shm.mount: Deactivated successfully.
Feb 26 20:51:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-cd71ed1796b2b6939f6935559eaefcff93f23a8a5253baa31a545105763d4ee9-merged.mount: Deactivated successfully.
Feb 26 20:51:40 compute-0 podman[219292]: 2026-02-26 20:51:40.160026722 +0000 UTC m=+0.087461045 container cleanup 69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.170 186592 INFO nova.virt.libvirt.driver [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Instance destroyed successfully.
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.170 186592 DEBUG nova.objects.instance [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lazy-loading 'resources' on Instance uuid db65189c-3257-4f7c-8407-d99446ead27c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:40 compute-0 systemd[1]: libpod-conmon-69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7.scope: Deactivated successfully.
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.182 186592 DEBUG nova.virt.libvirt.vif [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:50:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-789364433',display_name='tempest-ServerActionsTestJSON-server-789364433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-789364433',id=3,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKv4jZf/gJ2i41HbUx/UjYlMvLbOCl3KavS3raWK/kJbvOt949QnmXz4hRwBuj0ze7kGjLYbQ3QIBJLoNUIWmSkp5hXwN3v7JqVnHHG54WXxS3hNZgMcy8Kc47SEFtrOtQ==',key_name='tempest-keypair-883907450',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:50:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='93f63acb614a4c41813a655e2176374f',ramdisk_id='',reservation_id='r-57kan9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-377651542',owner_user_name='tempest-ServerActionsTestJSON-377651542-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:51:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='683dc1563e22496ba81bf3253756023f',uuid=db65189c-3257-4f7c-8407-d99446ead27c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.182 186592 DEBUG nova.network.os_vif_util [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converting VIF {"id": "83133bd7-0bf0-46a6-9cda-315762a021e8", "address": "fa:16:3e:77:0b:72", "network": {"id": "8912f988-fb86-4f9a-91d3-d98453103e4e", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1696189026-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93f63acb614a4c41813a655e2176374f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83133bd7-0b", "ovs_interfaceid": "83133bd7-0bf0-46a6-9cda-315762a021e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.183 186592 DEBUG nova.network.os_vif_util [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.183 186592 DEBUG os_vif [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.186 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.186 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83133bd7-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.188 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.189 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.191 186592 INFO os_vif [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:0b:72,bridge_name='br-int',has_traffic_filtering=True,id=83133bd7-0bf0-46a6-9cda-315762a021e8,network=Network(8912f988-fb86-4f9a-91d3-d98453103e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83133bd7-0b')
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.192 186592 INFO nova.virt.libvirt.driver [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Deleting instance files /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c_del
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.193 186592 INFO nova.virt.libvirt.driver [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Deletion of /var/lib/nova/instances/db65189c-3257-4f7c-8407-d99446ead27c_del complete
Feb 26 20:51:40 compute-0 podman[219339]: 2026-02-26 20:51:40.223871583 +0000 UTC m=+0.043430488 container remove 69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.228 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b6217713-f470-464d-8be8-2dd1a3647d4f]: (4, ('Thu Feb 26 08:51:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e (69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7)\n69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7\nThu Feb 26 08:51:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e (69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7)\n69c90580371543c8fb9cc7fb8882a61891126656b0ab95e99dcf93b1804a45e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.231 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[00ec42a6-8d4d-4ffd-bef4-ff52f2ee2b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.232 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8912f988-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:40 compute-0 kernel: tap8912f988-f0: left promiscuous mode
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.235 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.238 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.241 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[73c9f6b1-52dd-454c-9b60-e8ed1a65ff14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.256 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e558252b-2c38-40e1-a763-42730cb54181]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.257 186592 INFO nova.compute.manager [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.258 186592 DEBUG oslo.service.loopingcall [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.258 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e5edd623-8572-4930-902f-874489119c77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.260 186592 DEBUG nova.compute.manager [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.260 186592 DEBUG nova.network.neutron [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.271 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[30dbf9b3-c3cd-4719-bdcf-b601af2fda4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366509, 'reachable_time': 41451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219354, 'error': None, 'target': 'ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d8912f988\x2dfb86\x2d4f9a\x2d91d3\x2dd98453103e4e.mount: Deactivated successfully.
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.275 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8912f988-fb86-4f9a-91d3-d98453103e4e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:51:40 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:40.275 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[3186961d-d378-4736-b79d-07e9cd7592a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.298 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "ee73d279-95d6-412b-a16e-4d435d4d4445" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.298 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.318 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.411 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.411 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.421 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.422 186592 INFO nova.compute.claims [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.447 186592 DEBUG nova.compute.manager [req-a19970cd-6d75-4d6c-a434-78c28b211407 req-301ae5a1-1b38-4188-9ea1-65296c95abdd d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-unplugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.447 186592 DEBUG oslo_concurrency.lockutils [req-a19970cd-6d75-4d6c-a434-78c28b211407 req-301ae5a1-1b38-4188-9ea1-65296c95abdd d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.447 186592 DEBUG oslo_concurrency.lockutils [req-a19970cd-6d75-4d6c-a434-78c28b211407 req-301ae5a1-1b38-4188-9ea1-65296c95abdd d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.448 186592 DEBUG oslo_concurrency.lockutils [req-a19970cd-6d75-4d6c-a434-78c28b211407 req-301ae5a1-1b38-4188-9ea1-65296c95abdd d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.448 186592 DEBUG nova.compute.manager [req-a19970cd-6d75-4d6c-a434-78c28b211407 req-301ae5a1-1b38-4188-9ea1-65296c95abdd d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-unplugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.448 186592 DEBUG nova.compute.manager [req-a19970cd-6d75-4d6c-a434-78c28b211407 req-301ae5a1-1b38-4188-9ea1-65296c95abdd d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-unplugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.584 186592 DEBUG nova.compute.provider_tree [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.602 186592 DEBUG nova.scheduler.client.report [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.638 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.639 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.703 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.704 186592 DEBUG nova.network.neutron [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.721 186592 INFO nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.750 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.865 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.867 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.868 186592 INFO nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Creating image(s)
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.869 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "/var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.870 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "/var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.871 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "/var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.895 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.938 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.940 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.941 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.965 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:40 compute-0 nova_compute[186588]: 2026-02-26 20:51:40.980 186592 DEBUG nova.policy [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58204d2871684f63a7ba6a9f725d5791', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ca101c060f24e0da4913194059f2284', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.012 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.013 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.061 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.062 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.063 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.128 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.129 186592 DEBUG nova.virt.disk.api [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Checking if we can resize image /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.129 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.187 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.188 186592 DEBUG nova.virt.disk.api [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Cannot resize image /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.188 186592 DEBUG nova.objects.instance [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lazy-loading 'migration_context' on Instance uuid ee73d279-95d6-412b-a16e-4d435d4d4445 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.208 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.209 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Ensure instance console log exists: /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.209 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.209 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:41 compute-0 nova_compute[186588]: 2026-02-26 20:51:41.210 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.345 186592 DEBUG nova.network.neutron [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Successfully created port: 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.561 186592 DEBUG nova.compute.manager [req-8de53f3d-630a-407b-b457-81296b296327 req-e39d9aa1-0f03-4d6d-8a02-fecf3a895e9d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.562 186592 DEBUG oslo_concurrency.lockutils [req-8de53f3d-630a-407b-b457-81296b296327 req-e39d9aa1-0f03-4d6d-8a02-fecf3a895e9d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "db65189c-3257-4f7c-8407-d99446ead27c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.562 186592 DEBUG oslo_concurrency.lockutils [req-8de53f3d-630a-407b-b457-81296b296327 req-e39d9aa1-0f03-4d6d-8a02-fecf3a895e9d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.563 186592 DEBUG oslo_concurrency.lockutils [req-8de53f3d-630a-407b-b457-81296b296327 req-e39d9aa1-0f03-4d6d-8a02-fecf3a895e9d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.563 186592 DEBUG nova.compute.manager [req-8de53f3d-630a-407b-b457-81296b296327 req-e39d9aa1-0f03-4d6d-8a02-fecf3a895e9d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] No waiting events found dispatching network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.564 186592 WARNING nova.compute.manager [req-8de53f3d-630a-407b-b457-81296b296327 req-e39d9aa1-0f03-4d6d-8a02-fecf3a895e9d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received unexpected event network-vif-plugged-83133bd7-0bf0-46a6-9cda-315762a021e8 for instance with vm_state active and task_state deleting.
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.662 186592 DEBUG nova.network.neutron [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.738 186592 INFO nova.compute.manager [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Took 2.48 seconds to deallocate network for instance.
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.808 186592 DEBUG nova.compute.manager [req-559261cd-ec94-49d1-9b5a-0ee31da54414 req-7404d56a-350f-48e6-9b44-739c17d24881 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Received event network-vif-deleted-83133bd7-0bf0-46a6-9cda-315762a021e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.809 186592 INFO nova.compute.manager [req-559261cd-ec94-49d1-9b5a-0ee31da54414 req-7404d56a-350f-48e6-9b44-739c17d24881 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Neutron deleted interface 83133bd7-0bf0-46a6-9cda-315762a021e8; detaching it from the instance and deleting it from the info cache
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.809 186592 DEBUG nova.network.neutron [req-559261cd-ec94-49d1-9b5a-0ee31da54414 req-7404d56a-350f-48e6-9b44-739c17d24881 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.932 186592 DEBUG nova.compute.manager [req-559261cd-ec94-49d1-9b5a-0ee31da54414 req-7404d56a-350f-48e6-9b44-739c17d24881 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Detach interface failed, port_id=83133bd7-0bf0-46a6-9cda-315762a021e8, reason: Instance db65189c-3257-4f7c-8407-d99446ead27c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.956 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:42 compute-0 nova_compute[186588]: 2026-02-26 20:51:42.957 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.038 186592 DEBUG nova.compute.provider_tree [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.124 186592 DEBUG nova.scheduler.client.report [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.191 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.251 186592 INFO nova.scheduler.client.report [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Deleted allocations for instance db65189c-3257-4f7c-8407-d99446ead27c
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.323 186592 DEBUG oslo_concurrency.lockutils [None req-1c22d213-86ef-43e0-a8a1-d44a4c162b7f 683dc1563e22496ba81bf3253756023f 93f63acb614a4c41813a655e2176374f - - default default] Lock "db65189c-3257-4f7c-8407-d99446ead27c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.923 186592 DEBUG nova.network.neutron [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Successfully updated port: 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.951 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.951 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquired lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:43 compute-0 nova_compute[186588]: 2026-02-26 20:51:43.952 186592 DEBUG nova.network.neutron [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:51:44 compute-0 nova_compute[186588]: 2026-02-26 20:51:44.174 186592 DEBUG nova.network.neutron [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:51:44 compute-0 nova_compute[186588]: 2026-02-26 20:51:44.652 186592 DEBUG nova.compute.manager [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received event network-changed-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:44 compute-0 nova_compute[186588]: 2026-02-26 20:51:44.653 186592 DEBUG nova.compute.manager [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Refreshing instance network info cache due to event network-changed-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:51:44 compute-0 nova_compute[186588]: 2026-02-26 20:51:44.653 186592 DEBUG oslo_concurrency.lockutils [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:44 compute-0 nova_compute[186588]: 2026-02-26 20:51:44.781 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.178 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.188 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.214 186592 DEBUG nova.network.neutron [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updating instance_info_cache with network_info: [{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.245 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Releasing lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.246 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Instance network_info: |[{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.247 186592 DEBUG oslo_concurrency.lockutils [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.247 186592 DEBUG nova.network.neutron [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Refreshing network info cache for port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.253 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Start _get_guest_xml network_info=[{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.260 186592 WARNING nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.270 186592 DEBUG nova.virt.libvirt.host [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.271 186592 DEBUG nova.virt.libvirt.host [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.274 186592 DEBUG nova.virt.libvirt.host [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.275 186592 DEBUG nova.virt.libvirt.host [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.276 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.276 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.277 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.278 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.278 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.278 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.279 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.279 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.280 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.280 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.280 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.281 186592 DEBUG nova.virt.hardware [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.287 186592 DEBUG nova.virt.libvirt.vif [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:51:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1999762607',display_name='tempest-TestNetworkBasicOps-server-1999762607',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1999762607',id=6,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOJpgzvFCBhKJA77kRP/y+2d1DQj9wIju+uMLwWW0Hqrnj4aub2UN6IfQq9Z5Mg3FKkr0KAtuere/W3G+BQNfRX8aTL62NPj2Jgxj/6WX+hN7XQ9xQVtinrV9qxNSqLWTA==',key_name='tempest-TestNetworkBasicOps-130601821',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ca101c060f24e0da4913194059f2284',ramdisk_id='',reservation_id='r-1hfx1t60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-280950565',owner_user_name='tempest-TestNetworkBasicOps-280950565-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:51:40Z,user_data=None,user_id='58204d2871684f63a7ba6a9f725d5791',uuid=ee73d279-95d6-412b-a16e-4d435d4d4445,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.288 186592 DEBUG nova.network.os_vif_util [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converting VIF {"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.289 186592 DEBUG nova.network.os_vif_util [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.291 186592 DEBUG nova.objects.instance [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee73d279-95d6-412b-a16e-4d435d4d4445 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.308 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <uuid>ee73d279-95d6-412b-a16e-4d435d4d4445</uuid>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <name>instance-00000006</name>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:name>tempest-TestNetworkBasicOps-server-1999762607</nova:name>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:51:45</nova:creationTime>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:user uuid="58204d2871684f63a7ba6a9f725d5791">tempest-TestNetworkBasicOps-280950565-project-member</nova:user>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:project uuid="6ca101c060f24e0da4913194059f2284">tempest-TestNetworkBasicOps-280950565</nova:project>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         <nova:port uuid="7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4">
Feb 26 20:51:45 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <system>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <entry name="serial">ee73d279-95d6-412b-a16e-4d435d4d4445</entry>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <entry name="uuid">ee73d279-95d6-412b-a16e-4d435d4d4445</entry>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </system>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <os>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </os>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <features>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </features>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.config"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:4b:16:c7"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <target dev="tap7841dfe2-eb"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/console.log" append="off"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <video>
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </video>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:51:45 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:51:45 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:51:45 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:51:45 compute-0 nova_compute[186588]: </domain>
Feb 26 20:51:45 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.308 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Preparing to wait for external event network-vif-plugged-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.309 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.309 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.310 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.311 186592 DEBUG nova.virt.libvirt.vif [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:51:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1999762607',display_name='tempest-TestNetworkBasicOps-server-1999762607',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1999762607',id=6,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOJpgzvFCBhKJA77kRP/y+2d1DQj9wIju+uMLwWW0Hqrnj4aub2UN6IfQq9Z5Mg3FKkr0KAtuere/W3G+BQNfRX8aTL62NPj2Jgxj/6WX+hN7XQ9xQVtinrV9qxNSqLWTA==',key_name='tempest-TestNetworkBasicOps-130601821',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ca101c060f24e0da4913194059f2284',ramdisk_id='',reservation_id='r-1hfx1t60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-280950565',owner_user_name='tempest-TestNetworkBasicOps-280950565-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:51:40Z,user_data=None,user_id='58204d2871684f63a7ba6a9f725d5791',uuid=ee73d279-95d6-412b-a16e-4d435d4d4445,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.312 186592 DEBUG nova.network.os_vif_util [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converting VIF {"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.313 186592 DEBUG nova.network.os_vif_util [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.313 186592 DEBUG os_vif [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.314 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.315 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.315 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.319 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.320 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7841dfe2-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.320 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7841dfe2-eb, col_values=(('external_ids', {'iface-id': '7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:16:c7', 'vm-uuid': 'ee73d279-95d6-412b-a16e-4d435d4d4445'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.323 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 NetworkManager[56360]: <info>  [1772139105.3247] manager: (tap7841dfe2-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.327 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.330 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.333 186592 INFO os_vif [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb')
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.404 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.404 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.405 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] No VIF found with MAC fa:16:3e:4b:16:c7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:51:45 compute-0 nova_compute[186588]: 2026-02-26 20:51:45.405 186592 INFO nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Using config drive
Feb 26 20:51:45 compute-0 podman[219375]: 2026-02-26 20:51:45.459873538 +0000 UTC m=+0.071785736 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_id=ceilometer_agent_compute)
Feb 26 20:51:45 compute-0 podman[219373]: 2026-02-26 20:51:45.467759584 +0000 UTC m=+0.082334962 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:51:45 compute-0 podman[219374]: 2026-02-26 20:51:45.471158652 +0000 UTC m=+0.086310965 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.148 186592 INFO nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Creating config drive at /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.config
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.152 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpcaav3odf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.276 186592 DEBUG oslo_concurrency.processutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpcaav3odf" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:51:46 compute-0 kernel: tap7841dfe2-eb: entered promiscuous mode
Feb 26 20:51:46 compute-0 NetworkManager[56360]: <info>  [1772139106.3246] manager: (tap7841dfe2-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Feb 26 20:51:46 compute-0 ovn_controller[96598]: 2026-02-26T20:51:46Z|00083|binding|INFO|Claiming lport 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 for this chassis.
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.327 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:46 compute-0 ovn_controller[96598]: 2026-02-26T20:51:46Z|00084|binding|INFO|7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4: Claiming fa:16:3e:4b:16:c7 10.100.0.4
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.334 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:16:c7 10.100.0.4'], port_security=['fa:16:3e:4b:16:c7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ee73d279-95d6-412b-a16e-4d435d4d4445', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73a224a0-91c7-45a0-a00c-65db0bb99179', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ca101c060f24e0da4913194059f2284', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd8e56c35-cecd-4c9b-9c6d-759a1e86e218', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af54d277-9f4b-4357-988a-5344dd201d7a, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.335 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 in datapath 73a224a0-91c7-45a0-a00c-65db0bb99179 bound to our chassis
Feb 26 20:51:46 compute-0 ovn_controller[96598]: 2026-02-26T20:51:46Z|00085|binding|INFO|Setting lport 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 ovn-installed in OVS
Feb 26 20:51:46 compute-0 ovn_controller[96598]: 2026-02-26T20:51:46Z|00086|binding|INFO|Setting lport 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 up in Southbound
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.337 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 73a224a0-91c7-45a0-a00c-65db0bb99179
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.337 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.343 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.347 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[eb41cdaa-6770-4df1-be75-c51de9b3e9cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.349 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap73a224a0-91 in ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.352 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap73a224a0-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.352 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ade98ede-d372-4cf9-a45f-3d164c1679f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.354 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[49e8c022-5999-41ac-ac29-7af9695496f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 systemd-udevd[219449]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:51:46 compute-0 systemd-machined[155924]: New machine qemu-7-instance-00000006.
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.366 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[a83533f8-ecb7-4c35-bea5-76cba3071bd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 NetworkManager[56360]: <info>  [1772139106.3675] device (tap7841dfe2-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:51:46 compute-0 NetworkManager[56360]: <info>  [1772139106.3685] device (tap7841dfe2-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:51:46 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000006.
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.383 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e6979f-c0c0-49ec-b991-2ddb37e2b7bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.407 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[92c82d2f-6de1-469b-9796-e29c5c249053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.414 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[2501a444-1eda-4e57-bd3a-c2e1adaeeecf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 NetworkManager[56360]: <info>  [1772139106.4155] manager: (tap73a224a0-90): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Feb 26 20:51:46 compute-0 systemd-udevd[219452]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.437 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[100b0037-2b38-44d3-bd48-af4c5b9ba1b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.442 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad14789-016d-4cfb-ae52-227903b32c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 NetworkManager[56360]: <info>  [1772139106.4682] device (tap73a224a0-90): carrier: link connected
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.472 186592 DEBUG nova.network.neutron [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updated VIF entry in instance network info cache for port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.473 186592 DEBUG nova.network.neutron [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updating instance_info_cache with network_info: [{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.476 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8e7a53-2ff4-451f-950c-ff6f73996918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.491 186592 DEBUG oslo_concurrency.lockutils [req-381f3002-c308-496b-851b-7dd4ddbfbd8d req-2dde5114-8c21-4dfc-ae1c-3ba018f7f251 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.497 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[1826dbb0-b215-43c3-a634-edc76c1b1735]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73a224a0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:a5:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369302, 'reachable_time': 24309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219483, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.512 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[2572d034-c947-41a6-9450-87467d97546c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:a537'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369302, 'tstamp': 369302}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219484, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.521 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.522 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.523 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.533 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[4d35ac39-e37f-4d9a-8d78-bad816cd2f8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73a224a0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:a5:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369302, 'reachable_time': 24309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219485, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.556 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e861a42a-052e-4017-a0fb-9d767e5a1555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.613 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[654b80c6-1bc5-498e-9640-3be7ae5fa927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.615 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73a224a0-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.615 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.615 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73a224a0-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:46 compute-0 kernel: tap73a224a0-90: entered promiscuous mode
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.617 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:46 compute-0 NetworkManager[56360]: <info>  [1772139106.6184] manager: (tap73a224a0-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.626 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap73a224a0-90, col_values=(('external_ids', {'iface-id': 'ec7e2be4-d0ad-4655-b985-1fb327c75eec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:51:46 compute-0 ovn_controller[96598]: 2026-02-26T20:51:46Z|00087|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.627 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.632 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/73a224a0-91c7-45a0-a00c-65db0bb99179.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/73a224a0-91c7-45a0-a00c-65db0bb99179.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.633 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[274787ba-47b8-4627-88f3-bc40fa3d784d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.634 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-73a224a0-91c7-45a0-a00c-65db0bb99179
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/73a224a0-91c7-45a0-a00c-65db0bb99179.pid.haproxy
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 73a224a0-91c7-45a0-a00c-65db0bb99179
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:51:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:51:46.634 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'env', 'PROCESS_TAG=haproxy-73a224a0-91c7-45a0-a00c-65db0bb99179', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/73a224a0-91c7-45a0-a00c-65db0bb99179.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.637 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.672 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139106.6715324, ee73d279-95d6-412b-a16e-4d435d4d4445 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.672 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] VM Started (Lifecycle Event)
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.691 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.696 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139106.6740756, ee73d279-95d6-412b-a16e-4d435d4d4445 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.696 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] VM Paused (Lifecycle Event)
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.712 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.717 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:51:46 compute-0 nova_compute[186588]: 2026-02-26 20:51:46.743 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:51:46 compute-0 podman[219524]: 2026-02-26 20:51:46.99716323 +0000 UTC m=+0.050710335 container create e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:51:47 compute-0 systemd[1]: Started libpod-conmon-e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754.scope.
Feb 26 20:51:47 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:51:47 compute-0 podman[219524]: 2026-02-26 20:51:46.969483127 +0000 UTC m=+0.023030252 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:51:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a61822ad26c8e6302d881c968f9bd2e96fd7d6ff1defbe12b90712345d985d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:51:47 compute-0 podman[219524]: 2026-02-26 20:51:47.078679208 +0000 UTC m=+0.132226313 container init e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 26 20:51:47 compute-0 podman[219524]: 2026-02-26 20:51:47.082907578 +0000 UTC m=+0.136454663 container start e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223)
Feb 26 20:51:47 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [NOTICE]   (219543) : New worker (219545) forked
Feb 26 20:51:47 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [NOTICE]   (219543) : Loading success.
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.500 186592 DEBUG nova.objects.instance [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lazy-loading 'flavor' on Instance uuid 306c50ba-63e3-498f-8566-5f0bec7c6f16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.556 186592 DEBUG oslo_concurrency.lockutils [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.556 186592 DEBUG oslo_concurrency.lockutils [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.787 186592 DEBUG nova.compute.manager [req-5c38473b-e30c-4a46-be7c-60127fc83f4b req-6a7131d1-7d95-492b-a2dd-538e2e38abcc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received event network-vif-plugged-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.787 186592 DEBUG oslo_concurrency.lockutils [req-5c38473b-e30c-4a46-be7c-60127fc83f4b req-6a7131d1-7d95-492b-a2dd-538e2e38abcc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.787 186592 DEBUG oslo_concurrency.lockutils [req-5c38473b-e30c-4a46-be7c-60127fc83f4b req-6a7131d1-7d95-492b-a2dd-538e2e38abcc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.788 186592 DEBUG oslo_concurrency.lockutils [req-5c38473b-e30c-4a46-be7c-60127fc83f4b req-6a7131d1-7d95-492b-a2dd-538e2e38abcc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.788 186592 DEBUG nova.compute.manager [req-5c38473b-e30c-4a46-be7c-60127fc83f4b req-6a7131d1-7d95-492b-a2dd-538e2e38abcc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Processing event network-vif-plugged-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.789 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.792 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139108.7927418, ee73d279-95d6-412b-a16e-4d435d4d4445 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.793 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] VM Resumed (Lifecycle Event)
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.795 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.799 186592 INFO nova.virt.libvirt.driver [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Instance spawned successfully.
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.800 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.820 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.826 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.830 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.830 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.831 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.831 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.832 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.832 186592 DEBUG nova.virt.libvirt.driver [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.859 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.888 186592 INFO nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Took 8.02 seconds to spawn the instance on the hypervisor.
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.889 186592 DEBUG nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.956 186592 INFO nova.compute.manager [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Took 8.58 seconds to build instance.
Feb 26 20:51:48 compute-0 nova_compute[186588]: 2026-02-26 20:51:48.975 186592 DEBUG oslo_concurrency.lockutils [None req-a678bdf0-30c0-4847-aa8a-9cc049428e49 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:49 compute-0 nova_compute[186588]: 2026-02-26 20:51:49.784 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.266 186592 DEBUG nova.network.neutron [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.326 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.434 186592 DEBUG nova.compute.manager [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.434 186592 DEBUG nova.compute.manager [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing instance network info cache due to event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.434 186592 DEBUG oslo_concurrency.lockutils [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:50 compute-0 ovn_controller[96598]: 2026-02-26T20:51:50Z|00088|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:51:50 compute-0 ovn_controller[96598]: 2026-02-26T20:51:50Z|00089|binding|INFO|Releasing lport b54d5750-df20-4726-81d0-79644d2b6369 from this chassis (sb_readonly=0)
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.908 186592 DEBUG nova.compute.manager [req-f63ecaa1-60a6-4898-ad8e-804671b6b6cc req-831c7d42-b822-4aa7-9fe6-53856200379d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received event network-vif-plugged-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.909 186592 DEBUG oslo_concurrency.lockutils [req-f63ecaa1-60a6-4898-ad8e-804671b6b6cc req-831c7d42-b822-4aa7-9fe6-53856200379d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.909 186592 DEBUG oslo_concurrency.lockutils [req-f63ecaa1-60a6-4898-ad8e-804671b6b6cc req-831c7d42-b822-4aa7-9fe6-53856200379d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.909 186592 DEBUG oslo_concurrency.lockutils [req-f63ecaa1-60a6-4898-ad8e-804671b6b6cc req-831c7d42-b822-4aa7-9fe6-53856200379d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.909 186592 DEBUG nova.compute.manager [req-f63ecaa1-60a6-4898-ad8e-804671b6b6cc req-831c7d42-b822-4aa7-9fe6-53856200379d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] No waiting events found dispatching network-vif-plugged-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.909 186592 WARNING nova.compute.manager [req-f63ecaa1-60a6-4898-ad8e-804671b6b6cc req-831c7d42-b822-4aa7-9fe6-53856200379d d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received unexpected event network-vif-plugged-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 for instance with vm_state active and task_state None.
Feb 26 20:51:50 compute-0 nova_compute[186588]: 2026-02-26 20:51:50.963 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:52 compute-0 ovn_controller[96598]: 2026-02-26T20:51:52Z|00090|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:51:52 compute-0 ovn_controller[96598]: 2026-02-26T20:51:52Z|00091|binding|INFO|Releasing lport b54d5750-df20-4726-81d0-79644d2b6369 from this chassis (sb_readonly=0)
Feb 26 20:51:52 compute-0 nova_compute[186588]: 2026-02-26 20:51:52.606 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:53 compute-0 nova_compute[186588]: 2026-02-26 20:51:53.394 186592 DEBUG nova.network.neutron [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:53 compute-0 nova_compute[186588]: 2026-02-26 20:51:53.413 186592 DEBUG oslo_concurrency.lockutils [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:53 compute-0 nova_compute[186588]: 2026-02-26 20:51:53.413 186592 DEBUG nova.compute.manager [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Feb 26 20:51:53 compute-0 nova_compute[186588]: 2026-02-26 20:51:53.414 186592 DEBUG nova.compute.manager [None req-1b6258c0-bb71-4804-ad0b-60cbbfc4cd25 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] network_info to inject: |[{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Feb 26 20:51:53 compute-0 nova_compute[186588]: 2026-02-26 20:51:53.416 186592 DEBUG oslo_concurrency.lockutils [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:53 compute-0 nova_compute[186588]: 2026-02-26 20:51:53.417 186592 DEBUG nova.network.neutron [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:51:53 compute-0 podman[219556]: 2026-02-26 20:51:53.55885187 +0000 UTC m=+0.070824400 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:51:54 compute-0 nova_compute[186588]: 2026-02-26 20:51:54.010 186592 DEBUG nova.compute.manager [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received event network-changed-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:54 compute-0 nova_compute[186588]: 2026-02-26 20:51:54.011 186592 DEBUG nova.compute.manager [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Refreshing instance network info cache due to event network-changed-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:51:54 compute-0 nova_compute[186588]: 2026-02-26 20:51:54.011 186592 DEBUG oslo_concurrency.lockutils [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:54 compute-0 nova_compute[186588]: 2026-02-26 20:51:54.011 186592 DEBUG oslo_concurrency.lockutils [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:54 compute-0 nova_compute[186588]: 2026-02-26 20:51:54.012 186592 DEBUG nova.network.neutron [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Refreshing network info cache for port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:51:54 compute-0 nova_compute[186588]: 2026-02-26 20:51:54.785 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:55 compute-0 nova_compute[186588]: 2026-02-26 20:51:55.166 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139100.1657712, db65189c-3257-4f7c-8407-d99446ead27c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:51:55 compute-0 nova_compute[186588]: 2026-02-26 20:51:55.167 186592 INFO nova.compute.manager [-] [instance: db65189c-3257-4f7c-8407-d99446ead27c] VM Stopped (Lifecycle Event)
Feb 26 20:51:55 compute-0 nova_compute[186588]: 2026-02-26 20:51:55.187 186592 DEBUG nova.compute.manager [None req-1259895f-ac1a-4084-b939-776e006f70a3 - - - - - -] [instance: db65189c-3257-4f7c-8407-d99446ead27c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:51:55 compute-0 nova_compute[186588]: 2026-02-26 20:51:55.327 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:51:55 compute-0 nova_compute[186588]: 2026-02-26 20:51:55.800 186592 DEBUG nova.objects.instance [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lazy-loading 'flavor' on Instance uuid 306c50ba-63e3-498f-8566-5f0bec7c6f16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:51:55 compute-0 nova_compute[186588]: 2026-02-26 20:51:55.829 186592 DEBUG oslo_concurrency.lockutils [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:56 compute-0 nova_compute[186588]: 2026-02-26 20:51:56.685 186592 DEBUG nova.network.neutron [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updated VIF entry in instance network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:51:56 compute-0 nova_compute[186588]: 2026-02-26 20:51:56.686 186592 DEBUG nova.network.neutron [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:56 compute-0 nova_compute[186588]: 2026-02-26 20:51:56.714 186592 DEBUG oslo_concurrency.lockutils [req-25a1fc18-c459-41c7-bb29-44d3c9eccce0 req-1aa63146-d3be-475e-9ba2-db1c8bc7fdf5 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:56 compute-0 nova_compute[186588]: 2026-02-26 20:51:56.715 186592 DEBUG oslo_concurrency.lockutils [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:51:57 compute-0 nova_compute[186588]: 2026-02-26 20:51:57.086 186592 DEBUG nova.network.neutron [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updated VIF entry in instance network info cache for port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:51:57 compute-0 nova_compute[186588]: 2026-02-26 20:51:57.087 186592 DEBUG nova.network.neutron [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updating instance_info_cache with network_info: [{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:51:57 compute-0 nova_compute[186588]: 2026-02-26 20:51:57.108 186592 DEBUG oslo_concurrency.lockutils [req-3262de88-1166-447e-b5ab-cc1696806e10 req-8f572771-163a-4e89-9039-24730c617abc d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:51:58 compute-0 nova_compute[186588]: 2026-02-26 20:51:58.219 186592 DEBUG nova.network.neutron [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:51:58 compute-0 nova_compute[186588]: 2026-02-26 20:51:58.550 186592 DEBUG nova.compute.manager [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:51:58 compute-0 nova_compute[186588]: 2026-02-26 20:51:58.550 186592 DEBUG nova.compute.manager [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing instance network info cache due to event network-changed-e3ce81b9-d6fc-4207-9379-266c99cd8d12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:51:58 compute-0 nova_compute[186588]: 2026-02-26 20:51:58.551 186592 DEBUG oslo_concurrency.lockutils [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:51:59 compute-0 podman[202527]: time="2026-02-26T20:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:51:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 24454 "" "Go-http-client/1.1"
Feb 26 20:51:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3969 "" "Go-http-client/1.1"
Feb 26 20:51:59 compute-0 nova_compute[186588]: 2026-02-26 20:51:59.788 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.329 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.350 186592 DEBUG nova.network.neutron [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.371 186592 DEBUG oslo_concurrency.lockutils [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.372 186592 DEBUG nova.compute.manager [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.372 186592 DEBUG nova.compute.manager [None req-fa589351-3181-4dee-83f0-da49ba89d57d e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] network_info to inject: |[{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.374 186592 DEBUG oslo_concurrency.lockutils [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:52:00 compute-0 nova_compute[186588]: 2026-02-26 20:52:00.374 186592 DEBUG nova.network.neutron [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Refreshing network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:52:00 compute-0 ovn_controller[96598]: 2026-02-26T20:52:00Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4b:16:c7 10.100.0.4
Feb 26 20:52:00 compute-0 ovn_controller[96598]: 2026-02-26T20:52:00Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4b:16:c7 10.100.0.4
Feb 26 20:52:00 compute-0 podman[219598]: 2026-02-26 20:52:00.535610049 +0000 UTC m=+0.046880474 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.140 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.140 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.141 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.141 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.141 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.142 186592 INFO nova.compute.manager [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Terminating instance
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.143 186592 DEBUG nova.compute.manager [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:52:01 compute-0 kernel: tape3ce81b9-d6 (unregistering): left promiscuous mode
Feb 26 20:52:01 compute-0 NetworkManager[56360]: <info>  [1772139121.1631] device (tape3ce81b9-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.167 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00092|binding|INFO|Releasing lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 from this chassis (sb_readonly=0)
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00093|binding|INFO|Setting lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 down in Southbound
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00094|binding|INFO|Removing iface tape3ce81b9-d6 ovn-installed in OVS
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.177 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000005.scope: Deactivated successfully.
Feb 26 20:52:01 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000005.scope: Consumed 13.135s CPU time.
Feb 26 20:52:01 compute-0 systemd-machined[155924]: Machine qemu-6-instance-00000005 terminated.
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.299 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:67:34 10.100.0.7'], port_security=['fa:16:3e:98:67:34 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '306c50ba-63e3-498f-8566-5f0bec7c6f16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b80a050-87c9-4751-99e2-3a99a2801616', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9675902f465e4c1c91aa9f01efef2bcd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40aec3d6-8577-468e-aef7-841bb25f2273', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65647a3d-64ee-4360-9338-ee166e181aba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=e3ce81b9-d6fc-4207-9379-266c99cd8d12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.303 105929 INFO neutron.agent.ovn.metadata.agent [-] Port e3ce81b9-d6fc-4207-9379-266c99cd8d12 in datapath 6b80a050-87c9-4751-99e2-3a99a2801616 unbound from our chassis
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.307 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b80a050-87c9-4751-99e2-3a99a2801616, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.308 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e8279305-029f-4850-84ba-4b727deef1d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.310 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616 namespace which is not needed anymore
Feb 26 20:52:01 compute-0 kernel: tape3ce81b9-d6: entered promiscuous mode
Feb 26 20:52:01 compute-0 kernel: tape3ce81b9-d6 (unregistering): left promiscuous mode
Feb 26 20:52:01 compute-0 NetworkManager[56360]: <info>  [1772139121.3676] manager: (tape3ce81b9-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00095|binding|INFO|Claiming lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 for this chassis.
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00096|binding|INFO|e3ce81b9-d6fc-4207-9379-266c99cd8d12: Claiming fa:16:3e:98:67:34 10.100.0.7
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.374 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.382 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:67:34 10.100.0.7'], port_security=['fa:16:3e:98:67:34 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '306c50ba-63e3-498f-8566-5f0bec7c6f16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b80a050-87c9-4751-99e2-3a99a2801616', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9675902f465e4c1c91aa9f01efef2bcd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40aec3d6-8577-468e-aef7-841bb25f2273', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65647a3d-64ee-4360-9338-ee166e181aba, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=e3ce81b9-d6fc-4207-9379-266c99cd8d12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00097|binding|INFO|Setting lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 up in Southbound
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00098|binding|INFO|Setting lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 ovn-installed in OVS
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.385 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.407 186592 INFO nova.virt.libvirt.driver [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Instance destroyed successfully.
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.409 186592 DEBUG nova.objects.instance [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lazy-loading 'resources' on Instance uuid 306c50ba-63e3-498f-8566-5f0bec7c6f16 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:52:01 compute-0 openstack_network_exporter[205682]: ERROR   20:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:52:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:52:01 compute-0 openstack_network_exporter[205682]: ERROR   20:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:52:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.425 186592 DEBUG nova.virt.libvirt.vif [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:51:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2101132382',display_name='tempest-AttachInterfacesUnderV243Test-server-2101132382',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2101132382',id=5,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjpvmwPzIY7GfWacos52pyUK7MgQMW6oZwTpV32LBP4pG/tIcrdYeEVFDm0M5iLwayDtke+F6C95ipnP4EGy5v3daqlrfgtsrqzkPS74x9iw9TPmFPnWvd3AxDsFRrUyw==',key_name='tempest-keypair-1091348862',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:51:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9675902f465e4c1c91aa9f01efef2bcd',ramdisk_id='',reservation_id='r-9okq05ob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-61122046',owner_user_name='tempest-AttachInterfacesUnderV243Test-61122046-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:52:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e6bdd3e90ca54c35a342ed1197b27c8f',uuid=306c50ba-63e3-498f-8566-5f0bec7c6f16,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.426 186592 DEBUG nova.network.os_vif_util [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Converting VIF {"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.427 186592 DEBUG nova.network.os_vif_util [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.428 186592 DEBUG os_vif [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.429 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.429 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3ce81b9-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00099|binding|INFO|Releasing lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 from this chassis (sb_readonly=0)
Feb 26 20:52:01 compute-0 ovn_controller[96598]: 2026-02-26T20:52:01Z|00100|binding|INFO|Setting lport e3ce81b9-d6fc-4207-9379-266c99cd8d12 down in Southbound
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.432 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.437 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.439 186592 INFO os_vif [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:98:67:34,bridge_name='br-int',has_traffic_filtering=True,id=e3ce81b9-d6fc-4207-9379-266c99cd8d12,network=Network(6b80a050-87c9-4751-99e2-3a99a2801616),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3ce81b9-d6')
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.439 186592 INFO nova.virt.libvirt.driver [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Deleting instance files /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16_del
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.440 186592 INFO nova.virt.libvirt.driver [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Deletion of /var/lib/nova/instances/306c50ba-63e3-498f-8566-5f0bec7c6f16_del complete
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.442 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:67:34 10.100.0.7'], port_security=['fa:16:3e:98:67:34 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '306c50ba-63e3-498f-8566-5f0bec7c6f16', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b80a050-87c9-4751-99e2-3a99a2801616', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9675902f465e4c1c91aa9f01efef2bcd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '40aec3d6-8577-468e-aef7-841bb25f2273', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65647a3d-64ee-4360-9338-ee166e181aba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=e3ce81b9-d6fc-4207-9379-266c99cd8d12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:01 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [NOTICE]   (219175) : haproxy version is 2.8.14-c23fe91
Feb 26 20:52:01 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [NOTICE]   (219175) : path to executable is /usr/sbin/haproxy
Feb 26 20:52:01 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [WARNING]  (219175) : Exiting Master process...
Feb 26 20:52:01 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [ALERT]    (219175) : Current worker (219177) exited with code 143 (Terminated)
Feb 26 20:52:01 compute-0 neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616[219171]: [WARNING]  (219175) : All workers exited. Exiting... (0)
Feb 26 20:52:01 compute-0 systemd[1]: libpod-1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8.scope: Deactivated successfully.
Feb 26 20:52:01 compute-0 podman[219656]: 2026-02-26 20:52:01.457657317 +0000 UTC m=+0.056119856 container died 1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 26 20:52:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8-userdata-shm.mount: Deactivated successfully.
Feb 26 20:52:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-3d8ac5abd19887909fb91138dee8c9202452df4a423e1a679eb0c9c096900864-merged.mount: Deactivated successfully.
Feb 26 20:52:01 compute-0 podman[219656]: 2026-02-26 20:52:01.491327186 +0000 UTC m=+0.089789705 container cleanup 1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:52:01 compute-0 systemd[1]: libpod-conmon-1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8.scope: Deactivated successfully.
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.498 186592 INFO nova.compute.manager [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.499 186592 DEBUG oslo.service.loopingcall [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.501 186592 DEBUG nova.compute.manager [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.501 186592 DEBUG nova.network.neutron [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:52:01 compute-0 podman[219689]: 2026-02-26 20:52:01.558603473 +0000 UTC m=+0.046379463 container remove 1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.563 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[965bc1f4-d89a-4f8f-bcdc-9470397e5ab1]: (4, ('Thu Feb 26 08:52:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616 (1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8)\n1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8\nThu Feb 26 08:52:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616 (1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8)\n1fc9880739bd96875902cfa2a177641568e9965bf78f4dcfc47d9e263ee61aa8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.565 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[47fb6217-5686-4480-bb1b-e389d7cbfa39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.566 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b80a050-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.568 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 kernel: tap6b80a050-80: left promiscuous mode
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.573 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.578 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[5c59f2e3-e297-4bab-a250-6e8ec9e1b492]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.599 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[a4555c0d-19b5-44fb-840e-db7bbd58dda2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.601 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[e622cec7-9e00-4ca8-bc05-cf250c9047a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.619 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b9522476-3e9d-47b0-a220-18f3cc2d1a40]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367139, 'reachable_time': 32345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219705, 'error': None, 'target': 'ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.622 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b80a050-87c9-4751-99e2-3a99a2801616 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.622 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[912ea735-a0bc-4cef-81ae-4dbc49cea9bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d6b80a050\x2d87c9\x2d4751\x2d99e2\x2d3a99a2801616.mount: Deactivated successfully.
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.623 105929 INFO neutron.agent.ovn.metadata.agent [-] Port e3ce81b9-d6fc-4207-9379-266c99cd8d12 in datapath 6b80a050-87c9-4751-99e2-3a99a2801616 unbound from our chassis
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.624 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b80a050-87c9-4751-99e2-3a99a2801616, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.625 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a6a14a-340e-4635-8ce1-7222b6e0c7cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.626 105929 INFO neutron.agent.ovn.metadata.agent [-] Port e3ce81b9-d6fc-4207-9379-266c99cd8d12 in datapath 6b80a050-87c9-4751-99e2-3a99a2801616 unbound from our chassis
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.627 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b80a050-87c9-4751-99e2-3a99a2801616, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:52:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:01.627 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0280d0-902f-493b-b56e-67b90a429a93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.761 186592 DEBUG nova.compute.manager [req-b441324f-9f60-465f-975b-15763fe7412f req-d2335b26-9701-4298-a483-1988766fd32e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-unplugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.762 186592 DEBUG oslo_concurrency.lockutils [req-b441324f-9f60-465f-975b-15763fe7412f req-d2335b26-9701-4298-a483-1988766fd32e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.762 186592 DEBUG oslo_concurrency.lockutils [req-b441324f-9f60-465f-975b-15763fe7412f req-d2335b26-9701-4298-a483-1988766fd32e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.763 186592 DEBUG oslo_concurrency.lockutils [req-b441324f-9f60-465f-975b-15763fe7412f req-d2335b26-9701-4298-a483-1988766fd32e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.763 186592 DEBUG nova.compute.manager [req-b441324f-9f60-465f-975b-15763fe7412f req-d2335b26-9701-4298-a483-1988766fd32e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] No waiting events found dispatching network-vif-unplugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:52:01 compute-0 nova_compute[186588]: 2026-02-26 20:52:01.764 186592 DEBUG nova.compute.manager [req-b441324f-9f60-465f-975b-15763fe7412f req-d2335b26-9701-4298-a483-1988766fd32e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-unplugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 26 20:52:03 compute-0 nova_compute[186588]: 2026-02-26 20:52:03.015 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:03 compute-0 nova_compute[186588]: 2026-02-26 20:52:03.897 186592 DEBUG nova.network.neutron [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updated VIF entry in instance network info cache for port e3ce81b9-d6fc-4207-9379-266c99cd8d12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:52:03 compute-0 nova_compute[186588]: 2026-02-26 20:52:03.898 186592 DEBUG nova.network.neutron [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [{"id": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "address": "fa:16:3e:98:67:34", "network": {"id": "6b80a050-87c9-4751-99e2-3a99a2801616", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1955728928-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9675902f465e4c1c91aa9f01efef2bcd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3ce81b9-d6", "ovs_interfaceid": "e3ce81b9-d6fc-4207-9379-266c99cd8d12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:03 compute-0 nova_compute[186588]: 2026-02-26 20:52:03.918 186592 DEBUG oslo_concurrency.lockutils [req-d15cc2e4-784c-4ae8-b099-0759402a333e req-62e19412-4e78-49cc-bb0e-a4702aa0c2d6 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-306c50ba-63e3-498f-8566-5f0bec7c6f16" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.030 186592 DEBUG nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.031 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.032 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.032 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.032 186592 DEBUG nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] No waiting events found dispatching network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.032 186592 WARNING nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received unexpected event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 for instance with vm_state active and task_state deleting.
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.033 186592 DEBUG nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.033 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.033 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.033 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.034 186592 DEBUG nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] No waiting events found dispatching network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.034 186592 WARNING nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received unexpected event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 for instance with vm_state active and task_state deleting.
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.034 186592 DEBUG nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.034 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.035 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.035 186592 DEBUG oslo_concurrency.lockutils [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.036 186592 DEBUG nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] No waiting events found dispatching network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.036 186592 WARNING nova.compute.manager [req-33ff631b-6521-48b9-995b-de580c2c7370 req-6bbb94b6-febf-4556-8187-e5e2b4181c92 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received unexpected event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 for instance with vm_state active and task_state deleting.
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.064 186592 DEBUG nova.network.neutron [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.095 186592 INFO nova.compute.manager [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Took 2.59 seconds to deallocate network for instance.
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.157 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.157 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.305 186592 DEBUG nova.compute.provider_tree [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.325 186592 DEBUG nova.scheduler.client.report [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.362 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.404 186592 INFO nova.scheduler.client.report [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Deleted allocations for instance 306c50ba-63e3-498f-8566-5f0bec7c6f16
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.497 186592 DEBUG oslo_concurrency.lockutils [None req-3873783a-ad38-4d36-b52d-f60625aaa707 e6bdd3e90ca54c35a342ed1197b27c8f 9675902f465e4c1c91aa9f01efef2bcd - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:04 compute-0 podman[219708]: 2026-02-26 20:52:04.549160533 +0000 UTC m=+0.063493689 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, 
io.openshift.expose-services=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Feb 26 20:52:04 compute-0 nova_compute[186588]: 2026-02-26 20:52:04.790 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:05 compute-0 sshd-session[219706]: Received disconnect from 124.163.255.210 port 63767:11:  [preauth]
Feb 26 20:52:05 compute-0 sshd-session[219706]: Disconnected from authenticating user root 124.163.255.210 port 63767 [preauth]
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.077 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.308 186592 DEBUG nova.compute.manager [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.310 186592 DEBUG oslo_concurrency.lockutils [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.310 186592 DEBUG oslo_concurrency.lockutils [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.311 186592 DEBUG oslo_concurrency.lockutils [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "306c50ba-63e3-498f-8566-5f0bec7c6f16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.311 186592 DEBUG nova.compute.manager [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] No waiting events found dispatching network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.311 186592 WARNING nova.compute.manager [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received unexpected event network-vif-plugged-e3ce81b9-d6fc-4207-9379-266c99cd8d12 for instance with vm_state deleted and task_state None.
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.311 186592 DEBUG nova.compute.manager [req-2e468939-2625-43ca-977e-21480f01ca98 req-a40a660e-97a9-4cde-8df8-d1bf625ced0c d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Received event network-vif-deleted-e3ce81b9-d6fc-4207-9379-266c99cd8d12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.432 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.693 186592 INFO nova.compute.manager [None req-ad5e7ffa-0285-4b17-bda1-fafcbb6e2c5c 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Get console output
Feb 26 20:52:06 compute-0 nova_compute[186588]: 2026-02-26 20:52:06.784 217717 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 26 20:52:07 compute-0 nova_compute[186588]: 2026-02-26 20:52:07.401 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.059 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.363 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.363 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquired lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.363 186592 DEBUG nova.network.neutron [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.364 186592 DEBUG nova.objects.instance [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ee73d279-95d6-412b-a16e-4d435d4d4445 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:52:08 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:08.892 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:c2:31', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:84:98:ae:7a:1c'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:08 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:08.892 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 20:52:08 compute-0 nova_compute[186588]: 2026-02-26 20:52:08.893 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:09 compute-0 nova_compute[186588]: 2026-02-26 20:52:09.443 186592 DEBUG nova.compute.manager [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received event network-changed-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:09 compute-0 nova_compute[186588]: 2026-02-26 20:52:09.444 186592 DEBUG nova.compute.manager [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Refreshing instance network info cache due to event network-changed-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:52:09 compute-0 nova_compute[186588]: 2026-02-26 20:52:09.444 186592 DEBUG oslo_concurrency.lockutils [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:52:09 compute-0 nova_compute[186588]: 2026-02-26 20:52:09.791 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:10 compute-0 nova_compute[186588]: 2026-02-26 20:52:10.728 186592 DEBUG nova.network.neutron [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updating instance_info_cache with network_info: [{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:10 compute-0 nova_compute[186588]: 2026-02-26 20:52:10.759 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Releasing lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:10 compute-0 nova_compute[186588]: 2026-02-26 20:52:10.759 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 26 20:52:10 compute-0 nova_compute[186588]: 2026-02-26 20:52:10.760 186592 DEBUG oslo_concurrency.lockutils [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:52:10 compute-0 nova_compute[186588]: 2026-02-26 20:52:10.760 186592 DEBUG nova.network.neutron [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Refreshing network info cache for port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:52:10 compute-0 nova_compute[186588]: 2026-02-26 20:52:10.761 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:11 compute-0 nova_compute[186588]: 2026-02-26 20:52:11.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:11 compute-0 nova_compute[186588]: 2026-02-26 20:52:11.117 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:11 compute-0 nova_compute[186588]: 2026-02-26 20:52:11.435 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.055 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.492 186592 DEBUG nova.network.neutron [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updated VIF entry in instance network info cache for port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.493 186592 DEBUG nova.network.neutron [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updating instance_info_cache with network_info: [{"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.510 186592 DEBUG oslo_concurrency.lockutils [req-444f1453-df95-4c11-90f1-680cdfad8502 req-c6e71d1e-ddb5-4e3d-8dd4-fbb1f808ad2e d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-ee73d279-95d6-412b-a16e-4d435d4d4445" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:12 compute-0 ovn_controller[96598]: 2026-02-26T20:52:12Z|00101|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.640 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:12 compute-0 ovn_controller[96598]: 2026-02-26T20:52:12Z|00102|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:52:12 compute-0 nova_compute[186588]: 2026-02-26 20:52:12.683 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:13 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:13.894 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:14 compute-0 nova_compute[186588]: 2026-02-26 20:52:14.793 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.091 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.091 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.092 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.092 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.165 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.227 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.228 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.282 186592 DEBUG oslo_concurrency.processutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.390 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.392 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5477MB free_disk=72.7118911743164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.392 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.392 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.469 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Instance ee73d279-95d6-412b-a16e-4d435d4d4445 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.470 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.470 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.522 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.548 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.573 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:52:15 compute-0 nova_compute[186588]: 2026-02-26 20:52:15.573 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:16 compute-0 nova_compute[186588]: 2026-02-26 20:52:16.406 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139121.4052346, 306c50ba-63e3-498f-8566-5f0bec7c6f16 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:52:16 compute-0 nova_compute[186588]: 2026-02-26 20:52:16.407 186592 INFO nova.compute.manager [-] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] VM Stopped (Lifecycle Event)
Feb 26 20:52:16 compute-0 nova_compute[186588]: 2026-02-26 20:52:16.431 186592 DEBUG nova.compute.manager [None req-9fa162f0-ca4d-498f-886b-ea0e3d536f01 - - - - - -] [instance: 306c50ba-63e3-498f-8566-5f0bec7c6f16] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:52:16 compute-0 nova_compute[186588]: 2026-02-26 20:52:16.436 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:16 compute-0 podman[219739]: 2026-02-26 20:52:16.535747841 +0000 UTC m=+0.044729809 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:52:16 compute-0 podman[219740]: 2026-02-26 20:52:16.543647268 +0000 UTC m=+0.050033608 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, config_id=ceilometer_agent_compute, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute)
Feb 26 20:52:16 compute-0 podman[219738]: 2026-02-26 20:52:16.548165135 +0000 UTC m=+0.055753106 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:52:18 compute-0 nova_compute[186588]: 2026-02-26 20:52:18.573 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:52:18 compute-0 nova_compute[186588]: 2026-02-26 20:52:18.574 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:52:19 compute-0 nova_compute[186588]: 2026-02-26 20:52:19.794 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.214 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.215 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.245 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.343 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.344 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.351 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.352 186592 INFO nova.compute.claims [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.438 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.508 186592 DEBUG nova.compute.provider_tree [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.524 186592 DEBUG nova.scheduler.client.report [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.544 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.545 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.609 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.610 186592 DEBUG nova.network.neutron [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.631 186592 INFO nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.659 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.791 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.793 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.794 186592 INFO nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Creating image(s)
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.795 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "/var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.795 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "/var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.796 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "/var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.820 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.872 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.873 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.873 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.884 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.937 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.938 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.955 186592 DEBUG nova.policy [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58204d2871684f63a7ba6a9f725d5791', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ca101c060f24e0da4913194059f2284', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.968 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.969 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:21 compute-0 nova_compute[186588]: 2026-02-26 20:52:21.969 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.009 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.010 186592 DEBUG nova.virt.disk.api [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Checking if we can resize image /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.011 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.052 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.053 186592 DEBUG nova.virt.disk.api [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Cannot resize image /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.053 186592 DEBUG nova.objects.instance [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lazy-loading 'migration_context' on Instance uuid fa8cafdc-c2b5-4fe2-9e30-4a421b059492 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.081 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.082 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Ensure instance console log exists: /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.082 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.082 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:22 compute-0 nova_compute[186588]: 2026-02-26 20:52:22.082 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:23 compute-0 nova_compute[186588]: 2026-02-26 20:52:23.236 186592 DEBUG nova.network.neutron [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Successfully created port: 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:52:24 compute-0 podman[219817]: 2026-02-26 20:52:24.565146707 +0000 UTC m=+0.082492415 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:52:24 compute-0 nova_compute[186588]: 2026-02-26 20:52:24.797 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.150 186592 DEBUG nova.network.neutron [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Successfully updated port: 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.174 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.174 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquired lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.174 186592 DEBUG nova.network.neutron [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.462 186592 DEBUG nova.compute.manager [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Received event network-changed-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.462 186592 DEBUG nova.compute.manager [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Refreshing instance network info cache due to event network-changed-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.462 186592 DEBUG oslo_concurrency.lockutils [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:52:25 compute-0 nova_compute[186588]: 2026-02-26 20:52:25.738 186592 DEBUG nova.network.neutron [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:52:26 compute-0 nova_compute[186588]: 2026-02-26 20:52:26.440 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.926 186592 DEBUG nova.network.neutron [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Updating instance_info_cache with network_info: [{"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.953 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Releasing lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.954 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Instance network_info: |[{"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.954 186592 DEBUG oslo_concurrency.lockutils [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.954 186592 DEBUG nova.network.neutron [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Refreshing network info cache for port 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.958 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Start _get_guest_xml network_info=[{"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.963 186592 WARNING nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.968 186592 DEBUG nova.virt.libvirt.host [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.969 186592 DEBUG nova.virt.libvirt.host [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.975 186592 DEBUG nova.virt.libvirt.host [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.976 186592 DEBUG nova.virt.libvirt.host [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.976 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.976 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.977 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.977 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.977 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.978 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.978 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.978 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.979 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.979 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.979 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.980 186592 DEBUG nova.virt.hardware [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.983 186592 DEBUG nova.virt.libvirt.vif [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:52:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1840230091',display_name='tempest-TestNetworkBasicOps-server-1840230091',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1840230091',id=7,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLF3220QNXt7lodThpqO8604jBFFYoqnU24AZ/UjOyEdyt013rf8ROtkEa9zn0bVw66SvqxaTrWrVuw84lQ6tCwAvtJFjZZoE/WmnB8xgGjBQtBC6VXOVXmrgxUN/cFlwQ==',key_name='tempest-TestNetworkBasicOps-1289584',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ca101c060f24e0da4913194059f2284',ramdisk_id='',reservation_id='r-ygf2al9m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-280950565',owner_user_name='tempest-TestNetworkBasicOps-280950565-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:52:21Z,user_data=None,user_id='58204d2871684f63a7ba6a9f725d5791',uuid=fa8cafdc-c2b5-4fe2-9e30-4a421b059492,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.984 186592 DEBUG nova.network.os_vif_util [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converting VIF {"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.985 186592 DEBUG nova.network.os_vif_util [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:52:27 compute-0 nova_compute[186588]: 2026-02-26 20:52:27.986 186592 DEBUG nova.objects.instance [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lazy-loading 'pci_devices' on Instance uuid fa8cafdc-c2b5-4fe2-9e30-4a421b059492 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.007 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <uuid>fa8cafdc-c2b5-4fe2-9e30-4a421b059492</uuid>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <name>instance-00000007</name>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:name>tempest-TestNetworkBasicOps-server-1840230091</nova:name>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:52:27</nova:creationTime>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:user uuid="58204d2871684f63a7ba6a9f725d5791">tempest-TestNetworkBasicOps-280950565-project-member</nova:user>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:project uuid="6ca101c060f24e0da4913194059f2284">tempest-TestNetworkBasicOps-280950565</nova:project>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         <nova:port uuid="085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72">
Feb 26 20:52:28 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <system>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <entry name="serial">fa8cafdc-c2b5-4fe2-9e30-4a421b059492</entry>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <entry name="uuid">fa8cafdc-c2b5-4fe2-9e30-4a421b059492</entry>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </system>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <os>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </os>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <features>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </features>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.config"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:3b:67:1f"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <target dev="tap085f98f8-b1"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/console.log" append="off"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <video>
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </video>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:52:28 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:52:28 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:52:28 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:52:28 compute-0 nova_compute[186588]: </domain>
Feb 26 20:52:28 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.008 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Preparing to wait for external event network-vif-plugged-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.009 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.009 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.009 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.010 186592 DEBUG nova.virt.libvirt.vif [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-26T20:52:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1840230091',display_name='tempest-TestNetworkBasicOps-server-1840230091',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1840230091',id=7,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLF3220QNXt7lodThpqO8604jBFFYoqnU24AZ/UjOyEdyt013rf8ROtkEa9zn0bVw66SvqxaTrWrVuw84lQ6tCwAvtJFjZZoE/WmnB8xgGjBQtBC6VXOVXmrgxUN/cFlwQ==',key_name='tempest-TestNetworkBasicOps-1289584',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ca101c060f24e0da4913194059f2284',ramdisk_id='',reservation_id='r-ygf2al9m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-280950565',owner_user_name='tempest-TestNetworkBasicOps-280950565-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:52:21Z,user_data=None,user_id='58204d2871684f63a7ba6a9f725d5791',uuid=fa8cafdc-c2b5-4fe2-9e30-4a421b059492,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.011 186592 DEBUG nova.network.os_vif_util [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converting VIF {"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.012 186592 DEBUG nova.network.os_vif_util [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.012 186592 DEBUG os_vif [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.013 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.014 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.014 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.017 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.018 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap085f98f8-b1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.018 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap085f98f8-b1, col_values=(('external_ids', {'iface-id': '085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:67:1f', 'vm-uuid': 'fa8cafdc-c2b5-4fe2-9e30-4a421b059492'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.033 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 NetworkManager[56360]: <info>  [1772139148.0350] manager: (tap085f98f8-b1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.036 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.038 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.039 186592 INFO os_vif [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1')
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.098 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.098 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.099 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] No VIF found with MAC fa:16:3e:3b:67:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.100 186592 INFO nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Using config drive
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.664 186592 INFO nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Creating config drive at /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.config
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.673 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpqegxwght execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.796 186592 DEBUG oslo_concurrency.processutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpqegxwght" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:52:28 compute-0 kernel: tap085f98f8-b1: entered promiscuous mode
Feb 26 20:52:28 compute-0 NetworkManager[56360]: <info>  [1772139148.8359] manager: (tap085f98f8-b1): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Feb 26 20:52:28 compute-0 ovn_controller[96598]: 2026-02-26T20:52:28Z|00103|binding|INFO|Claiming lport 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 for this chassis.
Feb 26 20:52:28 compute-0 ovn_controller[96598]: 2026-02-26T20:52:28Z|00104|binding|INFO|085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72: Claiming fa:16:3e:3b:67:1f 10.100.0.6
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.837 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 ovn_controller[96598]: 2026-02-26T20:52:28Z|00105|binding|INFO|Setting lport 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 ovn-installed in OVS
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.843 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.845 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.849 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:67:1f 10.100.0.6'], port_security=['fa:16:3e:3b:67:1f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'fa8cafdc-c2b5-4fe2-9e30-4a421b059492', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73a224a0-91c7-45a0-a00c-65db0bb99179', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ca101c060f24e0da4913194059f2284', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86e8f2ec-3a35-4eb4-8b1b-7f6c85af908c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af54d277-9f4b-4357-988a-5344dd201d7a, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.850 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 in datapath 73a224a0-91c7-45a0-a00c-65db0bb99179 bound to our chassis
Feb 26 20:52:28 compute-0 ovn_controller[96598]: 2026-02-26T20:52:28Z|00106|binding|INFO|Setting lport 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 up in Southbound
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.852 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 73a224a0-91c7-45a0-a00c-65db0bb99179
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.868 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb3d542-08c3-4898-860f-42099b3ab023]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:28 compute-0 systemd-udevd[219860]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:52:28 compute-0 NetworkManager[56360]: <info>  [1772139148.8863] device (tap085f98f8-b1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:52:28 compute-0 NetworkManager[56360]: <info>  [1772139148.8872] device (tap085f98f8-b1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.894 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[83d972f8-c9b0-42ca-bfa3-515e02d0271b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.899 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[54a7ecb7-cd50-466f-b98f-e2abf94c6f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.921 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[83f81a0e-5d17-47b8-a1d9-8fce0c5ea3bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.935 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[3261571c-20b2-4189-adad-4881581058bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73a224a0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:a5:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369302, 'reachable_time': 24309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219867, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.950 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f4cf4f-8252-49cd-9e8c-6043da8d1db1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap73a224a0-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369314, 'tstamp': 369314}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219868, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap73a224a0-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369316, 'tstamp': 369316}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219868, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.952 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73a224a0-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.954 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 nova_compute[186588]: 2026-02-26 20:52:28.955 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.956 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73a224a0-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.956 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.956 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap73a224a0-90, col_values=(('external_ids', {'iface-id': 'ec7e2be4-d0ad-4655-b985-1fb327c75eec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:28 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:28.957 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:52:28 compute-0 systemd-machined[155924]: New machine qemu-8-instance-00000007.
Feb 26 20:52:28 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000007.
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.344 186592 DEBUG nova.compute.manager [req-fcd6a9d8-9d7d-429c-aa7c-b46fa88be49d req-945bb4b3-c717-42f2-84ca-8f26f4588e4a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Received event network-vif-plugged-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.344 186592 DEBUG oslo_concurrency.lockutils [req-fcd6a9d8-9d7d-429c-aa7c-b46fa88be49d req-945bb4b3-c717-42f2-84ca-8f26f4588e4a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.344 186592 DEBUG oslo_concurrency.lockutils [req-fcd6a9d8-9d7d-429c-aa7c-b46fa88be49d req-945bb4b3-c717-42f2-84ca-8f26f4588e4a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.345 186592 DEBUG oslo_concurrency.lockutils [req-fcd6a9d8-9d7d-429c-aa7c-b46fa88be49d req-945bb4b3-c717-42f2-84ca-8f26f4588e4a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.345 186592 DEBUG nova.compute.manager [req-fcd6a9d8-9d7d-429c-aa7c-b46fa88be49d req-945bb4b3-c717-42f2-84ca-8f26f4588e4a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Processing event network-vif-plugged-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.502 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139149.5015352, fa8cafdc-c2b5-4fe2-9e30-4a421b059492 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.502 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] VM Started (Lifecycle Event)
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.505 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.507 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.510 186592 INFO nova.virt.libvirt.driver [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Instance spawned successfully.
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.510 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.530 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.535 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.539 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.539 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.540 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.540 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.540 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.541 186592 DEBUG nova.virt.libvirt.driver [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.575 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.575 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139149.5016482, fa8cafdc-c2b5-4fe2-9e30-4a421b059492 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.575 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] VM Paused (Lifecycle Event)
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.604 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.609 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139149.506737, fa8cafdc-c2b5-4fe2-9e30-4a421b059492 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.610 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] VM Resumed (Lifecycle Event)
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.620 186592 INFO nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Took 7.83 seconds to spawn the instance on the hypervisor.
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.620 186592 DEBUG nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.645 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.648 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.683 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.695 186592 INFO nova.compute.manager [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Took 8.40 seconds to build instance.
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.729 186592 DEBUG oslo_concurrency.lockutils [None req-2c9d7eb4-95b8-45c6-b7a1-a09b7539bdb6 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:29 compute-0 podman[202527]: time="2026-02-26T20:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:52:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23221 "" "Go-http-client/1.1"
Feb 26 20:52:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3507 "" "Go-http-client/1.1"
Feb 26 20:52:29 compute-0 nova_compute[186588]: 2026-02-26 20:52:29.799 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.225 186592 DEBUG nova.network.neutron [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Updated VIF entry in instance network info cache for port 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.225 186592 DEBUG nova.network.neutron [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Updating instance_info_cache with network_info: [{"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.247 186592 DEBUG oslo_concurrency.lockutils [req-f9514083-fd4a-4072-a543-791995ab7104 req-a1781827-7e83-4b52-b227-d0786a34eb7f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:31 compute-0 openstack_network_exporter[205682]: ERROR   20:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:52:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:52:31 compute-0 openstack_network_exporter[205682]: ERROR   20:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:52:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.487 186592 DEBUG nova.compute.manager [req-bf6acecd-92ff-4ae2-885b-3e5f8d2ee0c2 req-f126f27f-dc59-4176-94aa-3b42b8570af1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Received event network-vif-plugged-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.488 186592 DEBUG oslo_concurrency.lockutils [req-bf6acecd-92ff-4ae2-885b-3e5f8d2ee0c2 req-f126f27f-dc59-4176-94aa-3b42b8570af1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.488 186592 DEBUG oslo_concurrency.lockutils [req-bf6acecd-92ff-4ae2-885b-3e5f8d2ee0c2 req-f126f27f-dc59-4176-94aa-3b42b8570af1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.488 186592 DEBUG oslo_concurrency.lockutils [req-bf6acecd-92ff-4ae2-885b-3e5f8d2ee0c2 req-f126f27f-dc59-4176-94aa-3b42b8570af1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.488 186592 DEBUG nova.compute.manager [req-bf6acecd-92ff-4ae2-885b-3e5f8d2ee0c2 req-f126f27f-dc59-4176-94aa-3b42b8570af1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] No waiting events found dispatching network-vif-plugged-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:52:31 compute-0 nova_compute[186588]: 2026-02-26 20:52:31.488 186592 WARNING nova.compute.manager [req-bf6acecd-92ff-4ae2-885b-3e5f8d2ee0c2 req-f126f27f-dc59-4176-94aa-3b42b8570af1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Received unexpected event network-vif-plugged-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 for instance with vm_state active and task_state None.
Feb 26 20:52:31 compute-0 podman[219885]: 2026-02-26 20:52:31.538551879 +0000 UTC m=+0.044286308 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:52:33 compute-0 nova_compute[186588]: 2026-02-26 20:52:33.035 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:34 compute-0 nova_compute[186588]: 2026-02-26 20:52:34.801 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:35 compute-0 nova_compute[186588]: 2026-02-26 20:52:35.258 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:35 compute-0 NetworkManager[56360]: <info>  [1772139155.2695] manager: (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Feb 26 20:52:35 compute-0 NetworkManager[56360]: <info>  [1772139155.2705] manager: (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 26 20:52:35 compute-0 nova_compute[186588]: 2026-02-26 20:52:35.279 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:35 compute-0 ovn_controller[96598]: 2026-02-26T20:52:35Z|00107|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:52:35 compute-0 nova_compute[186588]: 2026-02-26 20:52:35.288 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:35 compute-0 podman[219910]: 2026-02-26 20:52:35.533159667 +0000 UTC m=+0.050065188 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 26 20:52:36 compute-0 nova_compute[186588]: 2026-02-26 20:52:36.297 186592 DEBUG nova.compute.manager [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Received event network-changed-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:36 compute-0 nova_compute[186588]: 2026-02-26 20:52:36.298 186592 DEBUG nova.compute.manager [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Refreshing instance network info cache due to event network-changed-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:52:36 compute-0 nova_compute[186588]: 2026-02-26 20:52:36.298 186592 DEBUG oslo_concurrency.lockutils [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:52:36 compute-0 nova_compute[186588]: 2026-02-26 20:52:36.298 186592 DEBUG oslo_concurrency.lockutils [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:52:36 compute-0 nova_compute[186588]: 2026-02-26 20:52:36.299 186592 DEBUG nova.network.neutron [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Refreshing network info cache for port 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:52:38 compute-0 nova_compute[186588]: 2026-02-26 20:52:38.039 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:39 compute-0 nova_compute[186588]: 2026-02-26 20:52:39.377 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:39 compute-0 nova_compute[186588]: 2026-02-26 20:52:39.813 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:40 compute-0 nova_compute[186588]: 2026-02-26 20:52:40.038 186592 DEBUG nova.network.neutron [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Updated VIF entry in instance network info cache for port 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:52:40 compute-0 nova_compute[186588]: 2026-02-26 20:52:40.038 186592 DEBUG nova.network.neutron [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Updating instance_info_cache with network_info: [{"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:40 compute-0 nova_compute[186588]: 2026-02-26 20:52:40.061 186592 DEBUG oslo_concurrency.lockutils [req-9d9e6071-793c-4bb7-a1a2-7be70ba5394e req-6a66bc0a-d25a-4a8f-86ea-212facf96d88 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-fa8cafdc-c2b5-4fe2-9e30-4a421b059492" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:52:41 compute-0 ovn_controller[96598]: 2026-02-26T20:52:41Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:67:1f 10.100.0.6
Feb 26 20:52:41 compute-0 ovn_controller[96598]: 2026-02-26T20:52:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:67:1f 10.100.0.6
Feb 26 20:52:43 compute-0 nova_compute[186588]: 2026-02-26 20:52:43.042 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:44 compute-0 ovn_controller[96598]: 2026-02-26T20:52:44Z|00108|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:52:44 compute-0 nova_compute[186588]: 2026-02-26 20:52:44.553 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:44 compute-0 nova_compute[186588]: 2026-02-26 20:52:44.815 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:45 compute-0 ovn_controller[96598]: 2026-02-26T20:52:45Z|00109|binding|INFO|Releasing lport ec7e2be4-d0ad-4655-b985-1fb327c75eec from this chassis (sb_readonly=0)
Feb 26 20:52:45 compute-0 nova_compute[186588]: 2026-02-26 20:52:45.023 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:46.522 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:46.522 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:46.523 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:47 compute-0 podman[219950]: 2026-02-26 20:52:47.547774016 +0000 UTC m=+0.062900463 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:52:47 compute-0 podman[219951]: 2026-02-26 20:52:47.548435324 +0000 UTC m=+0.054685470 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 26 20:52:47 compute-0 podman[219952]: 2026-02-26 20:52:47.573647122 +0000 UTC m=+0.084700683 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 26 20:52:48 compute-0 nova_compute[186588]: 2026-02-26 20:52:48.046 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:48 compute-0 nova_compute[186588]: 2026-02-26 20:52:48.818 186592 INFO nova.compute.manager [None req-cbd22c03-d289-4d92-8684-363098f297ad 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Get console output
Feb 26 20:52:48 compute-0 nova_compute[186588]: 2026-02-26 20:52:48.824 217717 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.156 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.156 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.156 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.157 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.157 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.158 186592 INFO nova.compute.manager [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Terminating instance
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.159 186592 DEBUG nova.compute.manager [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:52:49 compute-0 kernel: tap085f98f8-b1 (unregistering): left promiscuous mode
Feb 26 20:52:49 compute-0 NetworkManager[56360]: <info>  [1772139169.1912] device (tap085f98f8-b1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.197 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 ovn_controller[96598]: 2026-02-26T20:52:49Z|00110|binding|INFO|Releasing lport 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 from this chassis (sb_readonly=0)
Feb 26 20:52:49 compute-0 ovn_controller[96598]: 2026-02-26T20:52:49Z|00111|binding|INFO|Setting lport 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 down in Southbound
Feb 26 20:52:49 compute-0 ovn_controller[96598]: 2026-02-26T20:52:49Z|00112|binding|INFO|Removing iface tap085f98f8-b1 ovn-installed in OVS
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.206 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.208 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:67:1f 10.100.0.6'], port_security=['fa:16:3e:3b:67:1f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'fa8cafdc-c2b5-4fe2-9e30-4a421b059492', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73a224a0-91c7-45a0-a00c-65db0bb99179', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ca101c060f24e0da4913194059f2284', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86e8f2ec-3a35-4eb4-8b1b-7f6c85af908c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af54d277-9f4b-4357-988a-5344dd201d7a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.210 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 in datapath 73a224a0-91c7-45a0-a00c-65db0bb99179 unbound from our chassis
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.211 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 73a224a0-91c7-45a0-a00c-65db0bb99179
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.227 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[57ef385b-320b-408b-aa95-a0e074e4c5bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 26 20:52:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000007.scope: Consumed 11.972s CPU time.
Feb 26 20:52:49 compute-0 systemd-machined[155924]: Machine qemu-8-instance-00000007 terminated.
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.248 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[d6143617-9301-485b-855b-f75564ad9942]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.251 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[68ce8fe2-4526-40b2-b3c5-92d11e97a2c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.271 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[11585679-84dc-488b-9d94-f02a83dd3b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.285 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[09ef66ec-4030-412a-a0fc-bd2321c580a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73a224a0-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:a5:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369302, 'reachable_time': 19429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220022, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.302 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[8aba6d01-d8a5-4082-a3f4-a5f73a863a07]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap73a224a0-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369314, 'tstamp': 369314}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220023, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap73a224a0-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369316, 'tstamp': 369316}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220023, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.305 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73a224a0-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.341 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.345 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.346 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73a224a0-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.347 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.347 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap73a224a0-90, col_values=(('external_ids', {'iface-id': 'ec7e2be4-d0ad-4655-b985-1fb327c75eec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:49 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:49.347 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.409 186592 INFO nova.virt.libvirt.driver [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Instance destroyed successfully.
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.410 186592 DEBUG nova.objects.instance [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lazy-loading 'resources' on Instance uuid fa8cafdc-c2b5-4fe2-9e30-4a421b059492 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.423 186592 DEBUG nova.virt.libvirt.vif [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:52:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1840230091',display_name='tempest-TestNetworkBasicOps-server-1840230091',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1840230091',id=7,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLF3220QNXt7lodThpqO8604jBFFYoqnU24AZ/UjOyEdyt013rf8ROtkEa9zn0bVw66SvqxaTrWrVuw84lQ6tCwAvtJFjZZoE/WmnB8xgGjBQtBC6VXOVXmrgxUN/cFlwQ==',key_name='tempest-TestNetworkBasicOps-1289584',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:52:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ca101c060f24e0da4913194059f2284',ramdisk_id='',reservation_id='r-ygf2al9m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-280950565',owner_user_name='tempest-TestNetworkBasicOps-280950565-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:52:29Z,user_data=None,user_id='58204d2871684f63a7ba6a9f725d5791',uuid=fa8cafdc-c2b5-4fe2-9e30-4a421b059492,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.423 186592 DEBUG nova.network.os_vif_util [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converting VIF {"id": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "address": "fa:16:3e:3b:67:1f", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap085f98f8-b1", "ovs_interfaceid": "085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.424 186592 DEBUG nova.network.os_vif_util [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.424 186592 DEBUG os_vif [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.425 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.426 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap085f98f8-b1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.427 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.429 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.431 186592 INFO os_vif [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:67:1f,bridge_name='br-int',has_traffic_filtering=True,id=085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap085f98f8-b1')
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.431 186592 INFO nova.virt.libvirt.driver [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Deleting instance files /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492_del
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.432 186592 INFO nova.virt.libvirt.driver [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Deletion of /var/lib/nova/instances/fa8cafdc-c2b5-4fe2-9e30-4a421b059492_del complete
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.482 186592 INFO nova.compute.manager [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Took 0.32 seconds to destroy the instance on the hypervisor.
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.482 186592 DEBUG oslo.service.loopingcall [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.483 186592 DEBUG nova.compute.manager [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.483 186592 DEBUG nova.network.neutron [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:52:49 compute-0 nova_compute[186588]: 2026-02-26 20:52:49.818 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.418 186592 DEBUG nova.network.neutron [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.436 186592 INFO nova.compute.manager [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Took 0.95 seconds to deallocate network for instance.
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.484 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.485 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.634 186592 DEBUG nova.compute.manager [req-e5c3275e-4098-4a4c-97cc-10aace693d39 req-9ea770ca-65d9-4202-857b-937bbf6d740f d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Received event network-vif-deleted-085f98f8-b1e1-4a44-bc6e-9a1c5fc10e72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.718 186592 DEBUG nova.compute.provider_tree [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.735 186592 DEBUG nova.scheduler.client.report [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.765 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.791 186592 INFO nova.scheduler.client.report [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Deleted allocations for instance fa8cafdc-c2b5-4fe2-9e30-4a421b059492
Feb 26 20:52:50 compute-0 nova_compute[186588]: 2026-02-26 20:52:50.866 186592 DEBUG oslo_concurrency.lockutils [None req-59f763cd-f233-4c6c-ac8f-5e8ea1000be8 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "fa8cafdc-c2b5-4fe2-9e30-4a421b059492" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.072 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.072 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.072 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.073 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.077 14 DEBUG ceilometer.compute.discovery [-] Querying metadata for instance ee73d279-95d6-412b-a16e-4d435d4d4445 from Nova API get_server /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:176
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.078 14 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/ee73d279-95d6-412b-a16e-4d435d4d4445 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b3e4473a35ee7cfb8b21c33c4813d695abd797ae73e2596c86aebf485e87031c" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:572
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.636 14 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1851 Content-Type: application/json Date: Thu, 26 Feb 2026 20:52:51 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-49b16eb5-6c85-49f9-b75c-6153ab10afe9 x-openstack-request-id: req-49b16eb5-6c85-49f9-b75c-6153ab10afe9 _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:613
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.637 14 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "ee73d279-95d6-412b-a16e-4d435d4d4445", "name": "tempest-TestNetworkBasicOps-server-1999762607", "status": "ACTIVE", "tenant_id": "6ca101c060f24e0da4913194059f2284", "user_id": "58204d2871684f63a7ba6a9f725d5791", "metadata": {}, "hostId": "58de98aecca7cb35901f93a15870b78b2413cd6cf980f3289a8d097e", "image": {"id": "b79c8674-3f8a-4529-8bd8-8464687ab831", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/b79c8674-3f8a-4529-8bd8-8464687ab831"}]}, "flavor": {"id": "82d482ee-c2f1-4b05-aa1e-0019c8aae3df", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/82d482ee-c2f1-4b05-aa1e-0019c8aae3df"}]}, "created": "2026-02-26T20:51:39Z", "updated": "2026-02-26T20:51:48Z", "addresses": {"tempest-network-smoke--25532305": [{"version": 4, "addr": "10.100.0.4", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:4b:16:c7"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/ee73d279-95d6-412b-a16e-4d435d4d4445"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/ee73d279-95d6-412b-a16e-4d435d4d4445"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-TestNetworkBasicOps-130601821", "OS-SRV-USG:launched_at": "2026-02-26T20:51:48.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-secgroup-smoke-1032727887"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000006", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:648
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.637 14 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/ee73d279-95d6-412b-a16e-4d435d4d4445 used request id req-49b16eb5-6c85-49f9-b75c-6153ab10afe9 request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:1073
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.637 14 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ee73d279-95d6-412b-a16e-4d435d4d4445', 'name': 'tempest-TestNetworkBasicOps-server-1999762607', 'flavor': {'id': '82d482ee-c2f1-4b05-aa1e-0019c8aae3df', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6ca101c060f24e0da4913194059f2284', 'user_id': '58204d2871684f63a7ba6a9f725d5791', 'hostId': '58de98aecca7cb35901f93a15870b78b2413cd6cf980f3289a8d097e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:315
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.638 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.638 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.638 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.638 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.638 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2026-02-26T20:52:51.638345) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.640 14 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ee73d279-95d6-412b-a16e-4d435d4d4445 / tap7841dfe2-eb inspect_vnics /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/inspector.py:143
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.outgoing.packets volume: 109 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:162
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.641 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2026-02-26T20:52:51.641812) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1999762607>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1999762607>]
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2026-02-26T20:52:51.642591) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.642 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.outgoing.bytes volume: 16018 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2026-02-26T20:52:51.643344) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.643 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2026-02-26T20:52:51.644042) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.root.size heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.644 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2026-02-26T20:52:51.644741) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2026-02-26T20:52:51.645360) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.645 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.646 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.646 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.646 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2026-02-26T20:52:51.646050) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.656 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.656 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.656 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.657 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.657 14 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.657 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.657 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.657 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: cpu heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.657 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2026-02-26T20:52:51.657664) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.670 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/cpu volume: 10940000000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.670 14 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.670 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.670 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.670 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.671 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.671 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.671 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.671 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2026-02-26T20:52:51.671272) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.671 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2026-02-26T20:52:51.672704) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.672 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.incoming.packets volume: 109 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.673 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.673 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.673 14 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.673 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.673 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.674 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: memory.usage heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.674 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2026-02-26T20:52:51.674043) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.674 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/memory.usage volume: 42.4375 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.674 14 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.674 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.675 14 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.675 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.675 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.675 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: power.state heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.675 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2026-02-26T20:52:51.675409) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.675 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/power.state volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2026-02-26T20:52:51.676711) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.676 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.677 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.677 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.677 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.678 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.678 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.678 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.678 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.678 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2026-02-26T20:52:51.678473) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.678 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.incoming.bytes volume: 19310 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.679 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.679 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.679 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.679 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.679 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.679 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.680 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2026-02-26T20:52:51.679945) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.705 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.706 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.706 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.706 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.707 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.707 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.707 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.707 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.707 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.read.latency volume: 556979833 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.707 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2026-02-26T20:52:51.707551) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.708 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.read.latency volume: 49895027 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.708 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.708 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.709 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.709 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.709 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.709 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.709 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2026-02-26T20:52:51.709457) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.709 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.read.requests volume: 1120 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.710 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.710 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.710 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.710 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.710 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.711 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.711 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.711 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2026-02-26T20:52:51.711338) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.711 14 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:162
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.711 14 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1999762607>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1999762607>]
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.712 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.712 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.712 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.712 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.712 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.713 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2026-02-26T20:52:51.712897) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.713 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.713 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.713 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.714 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.714 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.714 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.714 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.714 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.714 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2026-02-26T20:52:51.714791) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.715 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.715 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.715 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.716 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2026-02-26T20:52:51.716578) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.717 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.717 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.717 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.717 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.717 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.718 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.718 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2026-02-26T20:52:51.718106) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.718 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.write.latency volume: 2469451776 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.718 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.719 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.719 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.719 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.719 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.719 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.720 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.720 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2026-02-26T20:52:51.720103) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.720 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.720 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.721 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.721 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.721 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.721 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.722 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.722 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.722 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2026-02-26T20:52:51.722298) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.722 14 DEBUG ceilometer.compute.pollsters [-] ee73d279-95d6-412b-a16e-4d435d4d4445/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.723 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.723 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.723 14 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.723 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.723 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.723 14 DEBUG ceilometer.polling.manager [-] Polster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.724 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2026-02-26T20:52:51.723849) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.724 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.724 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.725 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.726 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.727 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:52:51.727 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.918 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "ee73d279-95d6-412b-a16e-4d435d4d4445" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.918 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.919 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.919 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.919 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.920 186592 INFO nova.compute.manager [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Terminating instance
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.921 186592 DEBUG nova.compute.manager [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:52:53 compute-0 kernel: tap7841dfe2-eb (unregistering): left promiscuous mode
Feb 26 20:52:53 compute-0 NetworkManager[56360]: <info>  [1772139173.9494] device (tap7841dfe2-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:52:53 compute-0 ovn_controller[96598]: 2026-02-26T20:52:53Z|00113|binding|INFO|Releasing lport 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 from this chassis (sb_readonly=0)
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.950 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:53 compute-0 ovn_controller[96598]: 2026-02-26T20:52:53Z|00114|binding|INFO|Setting lport 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 down in Southbound
Feb 26 20:52:53 compute-0 ovn_controller[96598]: 2026-02-26T20:52:53Z|00115|binding|INFO|Removing iface tap7841dfe2-eb ovn-installed in OVS
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.952 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:53 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:53.957 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:16:c7 10.100.0.4'], port_security=['fa:16:3e:4b:16:c7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ee73d279-95d6-412b-a16e-4d435d4d4445', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73a224a0-91c7-45a0-a00c-65db0bb99179', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ca101c060f24e0da4913194059f2284', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8e56c35-cecd-4c9b-9c6d-759a1e86e218', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af54d277-9f4b-4357-988a-5344dd201d7a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:52:53 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:53.959 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 in datapath 73a224a0-91c7-45a0-a00c-65db0bb99179 unbound from our chassis
Feb 26 20:52:53 compute-0 nova_compute[186588]: 2026-02-26 20:52:53.961 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:53 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:53.961 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 73a224a0-91c7-45a0-a00c-65db0bb99179, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:52:53 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:53.964 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[8a914014-b30b-4ac0-9ced-e308bd2d2c0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:53 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:53.964 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179 namespace which is not needed anymore
Feb 26 20:52:53 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 26 20:52:53 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000006.scope: Consumed 13.803s CPU time.
Feb 26 20:52:53 compute-0 systemd-machined[155924]: Machine qemu-7-instance-00000006 terminated.
Feb 26 20:52:54 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [NOTICE]   (219543) : haproxy version is 2.8.14-c23fe91
Feb 26 20:52:54 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [NOTICE]   (219543) : path to executable is /usr/sbin/haproxy
Feb 26 20:52:54 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [WARNING]  (219543) : Exiting Master process...
Feb 26 20:52:54 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [WARNING]  (219543) : Exiting Master process...
Feb 26 20:52:54 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [ALERT]    (219543) : Current worker (219545) exited with code 143 (Terminated)
Feb 26 20:52:54 compute-0 neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179[219539]: [WARNING]  (219543) : All workers exited. Exiting... (0)
Feb 26 20:52:54 compute-0 systemd[1]: libpod-e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754.scope: Deactivated successfully.
Feb 26 20:52:54 compute-0 podman[220069]: 2026-02-26 20:52:54.088255363 +0000 UTC m=+0.044472412 container died e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 26 20:52:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754-userdata-shm.mount: Deactivated successfully.
Feb 26 20:52:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a61822ad26c8e6302d881c968f9bd2e96fd7d6ff1defbe12b90712345d985d3-merged.mount: Deactivated successfully.
Feb 26 20:52:54 compute-0 podman[220069]: 2026-02-26 20:52:54.137401977 +0000 UTC m=+0.093619036 container cleanup e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.137 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.141 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 systemd[1]: libpod-conmon-e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754.scope: Deactivated successfully.
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.164 186592 INFO nova.virt.libvirt.driver [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Instance destroyed successfully.
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.164 186592 DEBUG nova.objects.instance [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lazy-loading 'resources' on Instance uuid ee73d279-95d6-412b-a16e-4d435d4d4445 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.178 186592 DEBUG nova.virt.libvirt.vif [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:51:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1999762607',display_name='tempest-TestNetworkBasicOps-server-1999762607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1999762607',id=6,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOJpgzvFCBhKJA77kRP/y+2d1DQj9wIju+uMLwWW0Hqrnj4aub2UN6IfQq9Z5Mg3FKkr0KAtuere/W3G+BQNfRX8aTL62NPj2Jgxj/6WX+hN7XQ9xQVtinrV9qxNSqLWTA==',key_name='tempest-TestNetworkBasicOps-130601821',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:51:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6ca101c060f24e0da4913194059f2284',ramdisk_id='',reservation_id='r-1hfx1t60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-280950565',owner_user_name='tempest-TestNetworkBasicOps-280950565-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:51:48Z,user_data=None,user_id='58204d2871684f63a7ba6a9f725d5791',uuid=ee73d279-95d6-412b-a16e-4d435d4d4445,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.179 186592 DEBUG nova.network.os_vif_util [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converting VIF {"id": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "address": "fa:16:3e:4b:16:c7", "network": {"id": "73a224a0-91c7-45a0-a00c-65db0bb99179", "bridge": "br-int", "label": "tempest-network-smoke--25532305", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ca101c060f24e0da4913194059f2284", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7841dfe2-eb", "ovs_interfaceid": "7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.179 186592 DEBUG nova.network.os_vif_util [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.179 186592 DEBUG os_vif [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.181 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.181 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7841dfe2-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.182 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.183 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.186 186592 INFO os_vif [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:16:c7,bridge_name='br-int',has_traffic_filtering=True,id=7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4,network=Network(73a224a0-91c7-45a0-a00c-65db0bb99179),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7841dfe2-eb')
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.187 186592 INFO nova.virt.libvirt.driver [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Deleting instance files /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445_del
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.187 186592 INFO nova.virt.libvirt.driver [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Deletion of /var/lib/nova/instances/ee73d279-95d6-412b-a16e-4d435d4d4445_del complete
Feb 26 20:52:54 compute-0 podman[220106]: 2026-02-26 20:52:54.278189104 +0000 UTC m=+0.124456642 container remove e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.281 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[206dfb8d-1248-4c82-a5e3-be2d4759d5fc]: (4, ('Thu Feb 26 08:52:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179 (e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754)\ne361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754\nThu Feb 26 08:52:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179 (e361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754)\ne361b9975d664b3b195bc622acbd9c6e4365095ff958e852bbfcadaeb2da7754\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.283 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[b82f208c-bc8b-46c8-9d76-6c77f52bb67d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.283 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73a224a0-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:52:54 compute-0 kernel: tap73a224a0-90: left promiscuous mode
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.286 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.289 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.293 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[75ddb77b-dfc0-460f-a5b2-86ddf9a772bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.315 186592 INFO nova.compute.manager [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.315 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[58fe3361-3e87-4fe6-81b8-cf95040676c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.316 186592 DEBUG oslo.service.loopingcall [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.316 186592 DEBUG nova.compute.manager [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.317 186592 DEBUG nova.network.neutron [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.316 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[598ae1aa-e61e-4059-b255-7ef85e3eafaf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.329 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[406b741b-6ad9-41df-8da1-f34b11e2ec15]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369296, 'reachable_time': 34492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220132, 'error': None, 'target': 'ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d73a224a0\x2d91c7\x2d45a0\x2da00c\x2d65db0bb99179.mount: Deactivated successfully.
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.332 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-73a224a0-91c7-45a0-a00c-65db0bb99179 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:52:54 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:52:54.332 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb326dc-7423-4881-8bc8-8ff81e090dd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:52:54 compute-0 nova_compute[186588]: 2026-02-26 20:52:54.818 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.340 186592 DEBUG nova.network.neutron [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.359 186592 INFO nova.compute.manager [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Took 1.04 seconds to deallocate network for instance.
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.401 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.402 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.425 186592 DEBUG nova.compute.manager [req-f8d096b1-ce39-4575-8dd5-02fb6a555927 req-b1d5ef39-b54f-4de5-93ae-e5413a801fb7 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Received event network-vif-deleted-7841dfe2-eb7c-48e8-ae47-bea5c3d5d2a4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.446 186592 DEBUG nova.compute.provider_tree [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.459 186592 DEBUG nova.scheduler.client.report [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.479 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.498 186592 INFO nova.scheduler.client.report [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Deleted allocations for instance ee73d279-95d6-412b-a16e-4d435d4d4445
Feb 26 20:52:55 compute-0 nova_compute[186588]: 2026-02-26 20:52:55.548 186592 DEBUG oslo_concurrency.lockutils [None req-ea6571c1-8b1a-4bcf-90d8-948c1f9d615d 58204d2871684f63a7ba6a9f725d5791 6ca101c060f24e0da4913194059f2284 - - default default] Lock "ee73d279-95d6-412b-a16e-4d435d4d4445" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:52:55 compute-0 podman[220133]: 2026-02-26 20:52:55.557649283 +0000 UTC m=+0.065609155 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 26 20:52:58 compute-0 nova_compute[186588]: 2026-02-26 20:52:58.294 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:59 compute-0 nova_compute[186588]: 2026-02-26 20:52:59.183 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:59 compute-0 nova_compute[186588]: 2026-02-26 20:52:59.511 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:59 compute-0 nova_compute[186588]: 2026-02-26 20:52:59.542 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:52:59 compute-0 podman[202527]: time="2026-02-26T20:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:52:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:52:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3040 "" "Go-http-client/1.1"
Feb 26 20:52:59 compute-0 nova_compute[186588]: 2026-02-26 20:52:59.820 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:01 compute-0 openstack_network_exporter[205682]: ERROR   20:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:53:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:53:01 compute-0 openstack_network_exporter[205682]: ERROR   20:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:53:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:53:02 compute-0 podman[220160]: 2026-02-26 20:53:02.530645984 +0000 UTC m=+0.045951561 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 26 20:53:04 compute-0 nova_compute[186588]: 2026-02-26 20:53:04.185 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:04 compute-0 nova_compute[186588]: 2026-02-26 20:53:04.409 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139169.4083102, fa8cafdc-c2b5-4fe2-9e30-4a421b059492 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:53:04 compute-0 nova_compute[186588]: 2026-02-26 20:53:04.410 186592 INFO nova.compute.manager [-] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] VM Stopped (Lifecycle Event)
Feb 26 20:53:04 compute-0 nova_compute[186588]: 2026-02-26 20:53:04.435 186592 DEBUG nova.compute.manager [None req-dee3a21f-05e1-47c3-b1bd-05f9ef2b2ba5 - - - - - -] [instance: fa8cafdc-c2b5-4fe2-9e30-4a421b059492] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:53:04 compute-0 nova_compute[186588]: 2026-02-26 20:53:04.823 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:06 compute-0 podman[220184]: 2026-02-26 20:53:06.534584836 +0000 UTC m=+0.052429260 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., version=9.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, release=1770267347, config_id=openstack_network_exporter)
Feb 26 20:53:08 compute-0 nova_compute[186588]: 2026-02-26 20:53:08.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:08 compute-0 nova_compute[186588]: 2026-02-26 20:53:08.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:53:08 compute-0 nova_compute[186588]: 2026-02-26 20:53:08.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:53:08 compute-0 nova_compute[186588]: 2026-02-26 20:53:08.085 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:53:09 compute-0 nova_compute[186588]: 2026-02-26 20:53:09.162 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139174.1621957, ee73d279-95d6-412b-a16e-4d435d4d4445 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:53:09 compute-0 nova_compute[186588]: 2026-02-26 20:53:09.163 186592 INFO nova.compute.manager [-] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] VM Stopped (Lifecycle Event)
Feb 26 20:53:09 compute-0 nova_compute[186588]: 2026-02-26 20:53:09.180 186592 DEBUG nova.compute.manager [None req-a412b5fa-5b15-4a65-bf46-63cf29118fc7 - - - - - -] [instance: ee73d279-95d6-412b-a16e-4d435d4d4445] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:53:09 compute-0 nova_compute[186588]: 2026-02-26 20:53:09.187 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:09 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:09.651 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:c2:31', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:84:98:ae:7a:1c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:53:09 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:09.652 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 20:53:09 compute-0 nova_compute[186588]: 2026-02-26 20:53:09.652 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:09 compute-0 nova_compute[186588]: 2026-02-26 20:53:09.824 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:11 compute-0 nova_compute[186588]: 2026-02-26 20:53:11.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:12 compute-0 nova_compute[186588]: 2026-02-26 20:53:12.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:12 compute-0 nova_compute[186588]: 2026-02-26 20:53:12.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:13 compute-0 nova_compute[186588]: 2026-02-26 20:53:13.055 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:13 compute-0 nova_compute[186588]: 2026-02-26 20:53:13.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:14 compute-0 nova_compute[186588]: 2026-02-26 20:53:14.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:14 compute-0 nova_compute[186588]: 2026-02-26 20:53:14.214 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:14 compute-0 nova_compute[186588]: 2026-02-26 20:53:14.825 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:15 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:15.654 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.085 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.086 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.086 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.087 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.257 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.258 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=72.73987197875977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.258 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.259 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.318 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.319 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.334 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.346 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.365 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:53:17 compute-0 nova_compute[186588]: 2026-02-26 20:53:17.365 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:18 compute-0 podman[220208]: 2026-02-26 20:53:18.54181076 +0000 UTC m=+0.052717688 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 26 20:53:18 compute-0 podman[220207]: 2026-02-26 20:53:18.551180384 +0000 UTC m=+0.066696011 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 26 20:53:18 compute-0 podman[220209]: 2026-02-26 20:53:18.567424539 +0000 UTC m=+0.078357297 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Feb 26 20:53:19 compute-0 nova_compute[186588]: 2026-02-26 20:53:19.216 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:19 compute-0 nova_compute[186588]: 2026-02-26 20:53:19.826 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:20 compute-0 nova_compute[186588]: 2026-02-26 20:53:20.366 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:53:20 compute-0 nova_compute[186588]: 2026-02-26 20:53:20.366 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:53:24 compute-0 nova_compute[186588]: 2026-02-26 20:53:24.218 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:24 compute-0 nova_compute[186588]: 2026-02-26 20:53:24.828 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:26 compute-0 podman[220270]: 2026-02-26 20:53:26.553171895 +0000 UTC m=+0.065180442 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.873 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.874 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.889 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.960 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.960 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.970 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 26 20:53:27 compute-0 nova_compute[186588]: 2026-02-26 20:53:27.971 186592 INFO nova.compute.claims [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Claim successful on node compute-0.ctlplane.example.com
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.060 186592 DEBUG nova.compute.provider_tree [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.074 186592 DEBUG nova.scheduler.client.report [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.095 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.096 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.352 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.352 186592 DEBUG nova.network.neutron [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.389 186592 INFO nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.412 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.494 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.495 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.495 186592 INFO nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Creating image(s)
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.495 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "/var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.496 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "/var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.496 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "/var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.515 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.533 186592 DEBUG nova.policy [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '713f1a36b6f94f8385695bf5d7923c04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dbe792f8647846a2a575b93f32132584', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.557 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.558 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "baa7093f309b972dfc26ad2355b06df960c90d8a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.558 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.571 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.612 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.614 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.638 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a,backing_fmt=raw /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.639 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "baa7093f309b972dfc26ad2355b06df960c90d8a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.639 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.690 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/baa7093f309b972dfc26ad2355b06df960c90d8a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.691 186592 DEBUG nova.virt.disk.api [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Checking if we can resize image /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.691 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.734 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.735 186592 DEBUG nova.virt.disk.api [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Cannot resize image /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.735 186592 DEBUG nova.objects.instance [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lazy-loading 'migration_context' on Instance uuid 41218741-4043-4065-b907-a515e68ab4d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.752 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.753 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Ensure instance console log exists: /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.753 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.753 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:28 compute-0 nova_compute[186588]: 2026-02-26 20:53:28.753 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:29 compute-0 nova_compute[186588]: 2026-02-26 20:53:29.221 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:29 compute-0 podman[202527]: time="2026-02-26T20:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:53:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:53:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3043 "" "Go-http-client/1.1"
Feb 26 20:53:29 compute-0 nova_compute[186588]: 2026-02-26 20:53:29.828 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:30 compute-0 nova_compute[186588]: 2026-02-26 20:53:30.435 186592 DEBUG nova.network.neutron [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Successfully created port: 7b6ced3e-d715-40d1-a692-bd78c7826343 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 26 20:53:31 compute-0 openstack_network_exporter[205682]: ERROR   20:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:53:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:53:31 compute-0 openstack_network_exporter[205682]: ERROR   20:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:53:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.718 186592 DEBUG nova.network.neutron [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Successfully updated port: 7b6ced3e-d715-40d1-a692-bd78c7826343 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.739 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.740 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquired lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.740 186592 DEBUG nova.network.neutron [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.839 186592 DEBUG nova.compute.manager [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-changed-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.839 186592 DEBUG nova.compute.manager [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Refreshing instance network info cache due to event network-changed-7b6ced3e-d715-40d1-a692-bd78c7826343. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:53:31 compute-0 nova_compute[186588]: 2026-02-26 20:53:31.839 186592 DEBUG oslo_concurrency.lockutils [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:53:32 compute-0 nova_compute[186588]: 2026-02-26 20:53:32.525 186592 DEBUG nova.network.neutron [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 26 20:53:33 compute-0 podman[220312]: 2026-02-26 20:53:33.54894238 +0000 UTC m=+0.061150419 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:53:34 compute-0 nova_compute[186588]: 2026-02-26 20:53:34.223 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:34 compute-0 nova_compute[186588]: 2026-02-26 20:53:34.830 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.139 186592 DEBUG nova.network.neutron [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Updating instance_info_cache with network_info: [{"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.158 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Releasing lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.158 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Instance network_info: |[{"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.158 186592 DEBUG oslo_concurrency.lockutils [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.159 186592 DEBUG nova.network.neutron [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Refreshing network info cache for port 7b6ced3e-d715-40d1-a692-bd78c7826343 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.161 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Start _get_guest_xml network_info=[{"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'guest_format': None, 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'image_id': 'b79c8674-3f8a-4529-8bd8-8464687ab831'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.164 186592 WARNING nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.168 186592 DEBUG nova.virt.libvirt.host [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.169 186592 DEBUG nova.virt.libvirt.host [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.171 186592 DEBUG nova.virt.libvirt.host [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.171 186592 DEBUG nova.virt.libvirt.host [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.171 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.171 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-26T20:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='82d482ee-c2f1-4b05-aa1e-0019c8aae3df',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-26T20:49:57Z,direct_url=<?>,disk_format='qcow2',id=b79c8674-3f8a-4529-8bd8-8464687ab831,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='e6068562706f4704b06eef53f5e2de5f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-26T20:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.172 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.172 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.172 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.172 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.173 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.173 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.173 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.173 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.173 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.173 186592 DEBUG nova.virt.hardware [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.177 186592 DEBUG nova.virt.libvirt.vif [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:53:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-124458125',display_name='tempest-TestServerBasicOps-server-124458125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-124458125',id=8,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxM9F6OsK/EFiDFp9m+RC2x6AOO0lee0+kc3m3dBlaDY3a+ixFGFHILJ/HIjNGvRWfhO+zQNpGpBIXMMM1CkncxYK3/Bfpx2CcZvGaHIO28J/rfipWfTnpTLyuR79hMSg==',key_name='tempest-TestServerBasicOps-1703881118',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dbe792f8647846a2a575b93f32132584',ramdisk_id='',reservation_id='r-c2kqz309',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-477858634',owner_user_name='tempest-TestServerBasicOps-477858634-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:53:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='713f1a36b6f94f8385695bf5d7923c04',uuid=41218741-4043-4065-b907-a515e68ab4d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.178 186592 DEBUG nova.network.os_vif_util [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Converting VIF {"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.178 186592 DEBUG nova.network.os_vif_util [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.179 186592 DEBUG nova.objects.instance [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lazy-loading 'pci_devices' on Instance uuid 41218741-4043-4065-b907-a515e68ab4d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.193 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] End _get_guest_xml xml=<domain type="kvm">
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <uuid>41218741-4043-4065-b907-a515e68ab4d3</uuid>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <name>instance-00000008</name>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <memory>131072</memory>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <vcpu>1</vcpu>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <metadata>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:name>tempest-TestServerBasicOps-server-124458125</nova:name>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:creationTime>2026-02-26 20:53:35</nova:creationTime>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:flavor name="m1.nano">
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:memory>128</nova:memory>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:disk>1</nova:disk>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:swap>0</nova:swap>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:ephemeral>0</nova:ephemeral>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:vcpus>1</nova:vcpus>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       </nova:flavor>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:owner>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:user uuid="713f1a36b6f94f8385695bf5d7923c04">tempest-TestServerBasicOps-477858634-project-member</nova:user>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:project uuid="dbe792f8647846a2a575b93f32132584">tempest-TestServerBasicOps-477858634</nova:project>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       </nova:owner>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:root type="image" uuid="b79c8674-3f8a-4529-8bd8-8464687ab831"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <nova:ports>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         <nova:port uuid="7b6ced3e-d715-40d1-a692-bd78c7826343">
Feb 26 20:53:35 compute-0 nova_compute[186588]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:         </nova:port>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       </nova:ports>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </nova:instance>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </metadata>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <sysinfo type="smbios">
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <system>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <entry name="manufacturer">RDO</entry>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <entry name="product">OpenStack Compute</entry>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <entry name="serial">41218741-4043-4065-b907-a515e68ab4d3</entry>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <entry name="uuid">41218741-4043-4065-b907-a515e68ab4d3</entry>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <entry name="family">Virtual Machine</entry>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </system>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </sysinfo>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <os>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <boot dev="hd"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <smbios mode="sysinfo"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </os>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <features>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <acpi/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <apic/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <vmcoreinfo/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </features>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <clock offset="utc">
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <timer name="pit" tickpolicy="delay"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <timer name="hpet" present="no"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </clock>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <cpu mode="host-model" match="exact">
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <topology sockets="1" cores="1" threads="1"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </cpu>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   <devices>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <disk type="file" device="disk">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <target dev="vda" bus="virtio"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <disk type="file" device="cdrom">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <driver name="qemu" type="raw" cache="none"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <source file="/var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.config"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <target dev="sda" bus="sata"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </disk>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <interface type="ethernet">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <mac address="fa:16:3e:8e:1a:72"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <driver name="vhost" rx_queue_size="512"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <mtu size="1442"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <target dev="tap7b6ced3e-d7"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </interface>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <serial type="pty">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <log file="/var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/console.log" append="off"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </serial>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <video>
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <model type="virtio"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </video>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <input type="tablet" bus="usb"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <rng model="virtio">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <backend model="random">/dev/urandom</backend>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </rng>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="pci" model="pcie-root-port"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <controller type="usb" index="0"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     <memballoon model="virtio">
Feb 26 20:53:35 compute-0 nova_compute[186588]:       <stats period="10"/>
Feb 26 20:53:35 compute-0 nova_compute[186588]:     </memballoon>
Feb 26 20:53:35 compute-0 nova_compute[186588]:   </devices>
Feb 26 20:53:35 compute-0 nova_compute[186588]: </domain>
Feb 26 20:53:35 compute-0 nova_compute[186588]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.193 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Preparing to wait for external event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.194 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.194 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.194 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.195 186592 DEBUG nova.virt.libvirt.vif [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:53:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-124458125',display_name='tempest-TestServerBasicOps-server-124458125',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-124458125',id=8,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxM9F6OsK/EFiDFp9m+RC2x6AOO0lee0+kc3m3dBlaDY3a+ixFGFHILJ/HIjNGvRWfhO+zQNpGpBIXMMM1CkncxYK3/Bfpx2CcZvGaHIO28J/rfipWfTnpTLyuR79hMSg==',key_name='tempest-TestServerBasicOps-1703881118',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dbe792f8647846a2a575b93f32132584',ramdisk_id='',reservation_id='r-c2kqz309',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-477858634',owner_user_name='tempest-TestServerBasicOps-477858634-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-26T20:53:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='713f1a36b6f94f8385695bf5d7923c04',uuid=41218741-4043-4065-b907-a515e68ab4d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.195 186592 DEBUG nova.network.os_vif_util [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Converting VIF {"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.195 186592 DEBUG nova.network.os_vif_util [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.196 186592 DEBUG os_vif [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.196 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.196 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.197 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.199 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.199 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b6ced3e-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.199 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b6ced3e-d7, col_values=(('external_ids', {'iface-id': '7b6ced3e-d715-40d1-a692-bd78c7826343', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:1a:72', 'vm-uuid': '41218741-4043-4065-b907-a515e68ab4d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.200 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 NetworkManager[56360]: <info>  [1772139215.2015] manager: (tap7b6ced3e-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.202 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.206 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.207 186592 INFO os_vif [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7')
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.264 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.264 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.264 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] No VIF found with MAC fa:16:3e:8e:1a:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.265 186592 INFO nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Using config drive
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.733 186592 INFO nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Creating config drive at /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.config
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.740 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp22527gpb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.921 186592 DEBUG oslo_concurrency.processutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp22527gpb" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 26 20:53:35 compute-0 kernel: tap7b6ced3e-d7: entered promiscuous mode
Feb 26 20:53:35 compute-0 ovn_controller[96598]: 2026-02-26T20:53:35Z|00116|binding|INFO|Claiming lport 7b6ced3e-d715-40d1-a692-bd78c7826343 for this chassis.
Feb 26 20:53:35 compute-0 ovn_controller[96598]: 2026-02-26T20:53:35Z|00117|binding|INFO|7b6ced3e-d715-40d1-a692-bd78c7826343: Claiming fa:16:3e:8e:1a:72 10.100.0.11
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.961 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 NetworkManager[56360]: <info>  [1772139215.9617] manager: (tap7b6ced3e-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.963 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.965 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.967 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.974 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:1a:72 10.100.0.11'], port_security=['fa:16:3e:8e:1a:72 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '41218741-4043-4065-b907-a515e68ab4d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbe792f8647846a2a575b93f32132584', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0dd430dc-9b7c-4774-8837-214ec3049e46 ad7c47b1-3c1b-4461-97d1-3ebfc2a23676', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb5008f7-b871-4a86-8c10-0d9474cf9977, chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=7b6ced3e-d715-40d1-a692-bd78c7826343) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.975 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 7b6ced3e-d715-40d1-a692-bd78c7826343 in datapath 1e235ab3-53aa-4e90-83dc-62b618e5dc61 bound to our chassis
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.977 105929 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1e235ab3-53aa-4e90-83dc-62b618e5dc61
Feb 26 20:53:35 compute-0 systemd-udevd[220354]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:53:35 compute-0 ovn_controller[96598]: 2026-02-26T20:53:35Z|00118|binding|INFO|Setting lport 7b6ced3e-d715-40d1-a692-bd78c7826343 ovn-installed in OVS
Feb 26 20:53:35 compute-0 ovn_controller[96598]: 2026-02-26T20:53:35Z|00119|binding|INFO|Setting lport 7b6ced3e-d715-40d1-a692-bd78c7826343 up in Southbound
Feb 26 20:53:35 compute-0 nova_compute[186588]: 2026-02-26 20:53:35.985 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.984 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[f5987f08-dde8-4c94-9c30-95bea813d560]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.986 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1e235ab3-51 in ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.988 217873 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1e235ab3-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.988 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[963875ba-e650-48c9-b863-8e2430f51224]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.988 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[d40c951c-bc5e-4d5e-947d-5b047897d249]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:35 compute-0 systemd-machined[155924]: New machine qemu-9-instance-00000008.
Feb 26 20:53:35 compute-0 NetworkManager[56360]: <info>  [1772139215.9943] device (tap7b6ced3e-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 26 20:53:35 compute-0 NetworkManager[56360]: <info>  [1772139215.9952] device (tap7b6ced3e-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 26 20:53:35 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:35.997 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[cebbc7b8-f45a-4184-b29c-02394de87e65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000008.
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.008 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[4889635b-2b7d-4f8b-881e-28c8067002fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.024 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b29655-3c2e-4e84-bedb-0031a515db04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 NetworkManager[56360]: <info>  [1772139216.0305] manager: (tap1e235ab3-50): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.029 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6d3e08-bc0d-47cd-9359-f8f235e4f63b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 systemd-udevd[220359]: Network interface NamePolicy= disabled on kernel command line.
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.052 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[fc08bdac-6e44-474a-8fd3-8e2920a3e8d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.054 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0ea7a2-047e-464c-b48a-39f5c00a66ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 NetworkManager[56360]: <info>  [1772139216.0673] device (tap1e235ab3-50): carrier: link connected
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.068 217909 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9db5ae-7033-4286-94a6-46a6c2e35d85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.081 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[ebec6a42-a96f-4aea-9868-f4756524643b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e235ab3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:d7:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380262, 'reachable_time': 21144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220388, 'error': None, 'target': 'ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.094 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[50f57b64-31b1-416b-96c7-af590e612f4c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:d729'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 380262, 'tstamp': 380262}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220389, 'error': None, 'target': 'ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.107 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[47f03d2e-c0ef-4b4e-8868-ff8497700b93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e235ab3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:d7:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380262, 'reachable_time': 21144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220390, 'error': None, 'target': 'ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.127 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[c839c483-dc8f-4a61-b994-899983aefafc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.165 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[cb53f251-0142-43ae-8a19-393e80601f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.167 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e235ab3-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.167 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.168 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e235ab3-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.204 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:36 compute-0 NetworkManager[56360]: <info>  [1772139216.2058] manager: (tap1e235ab3-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 26 20:53:36 compute-0 kernel: tap1e235ab3-50: entered promiscuous mode
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.208 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1e235ab3-50, col_values=(('external_ids', {'iface-id': '7dd2cb57-eac7-44af-bf40-6194adad2978'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:53:36 compute-0 ovn_controller[96598]: 2026-02-26T20:53:36Z|00120|binding|INFO|Releasing lport 7dd2cb57-eac7-44af-bf40-6194adad2978 from this chassis (sb_readonly=0)
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.210 105929 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1e235ab3-53aa-4e90-83dc-62b618e5dc61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1e235ab3-53aa-4e90-83dc-62b618e5dc61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.210 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.211 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[8b5f2dac-1310-4cdf-8e2a-ec94144d46fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.212 105929 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: global
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     log         /dev/log local0 debug
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     log-tag     haproxy-metadata-proxy-1e235ab3-53aa-4e90-83dc-62b618e5dc61
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     user        root
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     group       root
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     maxconn     1024
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     pidfile     /var/lib/neutron/external/pids/1e235ab3-53aa-4e90-83dc-62b618e5dc61.pid.haproxy
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     daemon
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: defaults
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     log global
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     mode http
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     option httplog
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     option dontlognull
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     option http-server-close
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     option forwardfor
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     retries                 3
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     timeout http-request    30s
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     timeout connect         30s
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     timeout client          32s
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     timeout server          32s
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     timeout http-keep-alive 30s
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: listen listener
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     bind 169.254.169.254:80
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     server metadata /var/lib/neutron/metadata_proxy
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:     http-request add-header X-OVN-Network-ID 1e235ab3-53aa-4e90-83dc-62b618e5dc61
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 26 20:53:36 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:36.213 105929 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'env', 'PROCESS_TAG=haproxy-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1e235ab3-53aa-4e90-83dc-62b618e5dc61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.213 186592 DEBUG nova.compute.manager [req-2b08a6c6-a064-4ff9-b39e-c02fc9d2c88f req-7b5a55e2-84c4-47fe-80cc-5019be368598 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.214 186592 DEBUG oslo_concurrency.lockutils [req-2b08a6c6-a064-4ff9-b39e-c02fc9d2c88f req-7b5a55e2-84c4-47fe-80cc-5019be368598 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.214 186592 DEBUG oslo_concurrency.lockutils [req-2b08a6c6-a064-4ff9-b39e-c02fc9d2c88f req-7b5a55e2-84c4-47fe-80cc-5019be368598 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.215 186592 DEBUG oslo_concurrency.lockutils [req-2b08a6c6-a064-4ff9-b39e-c02fc9d2c88f req-7b5a55e2-84c4-47fe-80cc-5019be368598 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.215 186592 DEBUG nova.compute.manager [req-2b08a6c6-a064-4ff9-b39e-c02fc9d2c88f req-7b5a55e2-84c4-47fe-80cc-5019be368598 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Processing event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.216 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.452 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.453 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139216.451109, 41218741-4043-4065-b907-a515e68ab4d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.454 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] VM Started (Lifecycle Event)
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.460 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.464 186592 INFO nova.virt.libvirt.driver [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Instance spawned successfully.
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.465 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 26 20:53:36 compute-0 podman[220429]: 2026-02-26 20:53:36.58966993 +0000 UTC m=+0.049520224 container create 2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.598 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.602 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.602 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.603 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.603 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.604 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.604 186592 DEBUG nova.virt.libvirt.driver [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.612 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:53:36 compute-0 systemd[1]: Started libpod-conmon-2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e.scope.
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.642 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.643 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139216.4525783, 41218741-4043-4065-b907-a515e68ab4d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.643 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] VM Paused (Lifecycle Event)
Feb 26 20:53:36 compute-0 systemd[1]: Started libcrun container.
Feb 26 20:53:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fb8e77de8c14331450c48e7225cc5a8b8f2f87df1a92df050add54ad644a21e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 26 20:53:36 compute-0 podman[220429]: 2026-02-26 20:53:36.563507456 +0000 UTC m=+0.023357710 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 26 20:53:36 compute-0 podman[220429]: 2026-02-26 20:53:36.663331523 +0000 UTC m=+0.123181767 container init 2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.664 186592 INFO nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Took 8.17 seconds to spawn the instance on the hypervisor.
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.664 186592 DEBUG nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.666 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:53:36 compute-0 podman[220429]: 2026-02-26 20:53:36.669946046 +0000 UTC m=+0.129796280 container start 2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.672 186592 DEBUG nova.virt.driver [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] Emitting event <LifecycleEvent: 1772139216.4596944, 41218741-4043-4065-b907-a515e68ab4d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.673 186592 INFO nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] VM Resumed (Lifecycle Event)
Feb 26 20:53:36 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [NOTICE]   (220465) : New worker (220471) forked
Feb 26 20:53:36 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [NOTICE]   (220465) : Loading success.
Feb 26 20:53:36 compute-0 podman[220442]: 2026-02-26 20:53:36.689612969 +0000 UTC m=+0.063558070 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, version=9.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.715 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.718 186592 DEBUG nova.compute.manager [None req-1fed5c16-9f3a-4706-8a3a-5330d53e484d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.731 186592 INFO nova.compute.manager [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Took 8.79 seconds to build instance.
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.745 186592 DEBUG oslo_concurrency.lockutils [None req-231e0164-c4c0-409a-9ba9-98eb780aa052 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.872s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.751 186592 DEBUG nova.network.neutron [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Updated VIF entry in instance network info cache for port 7b6ced3e-d715-40d1-a692-bd78c7826343. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.752 186592 DEBUG nova.network.neutron [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Updating instance_info_cache with network_info: [{"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:53:36 compute-0 nova_compute[186588]: 2026-02-26 20:53:36.770 186592 DEBUG oslo_concurrency.lockutils [req-0360ec46-8c71-445d-8e57-96468653e2b2 req-7709b8ef-e93e-4022-89d4-5f8ece1a2b67 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:53:38 compute-0 nova_compute[186588]: 2026-02-26 20:53:38.276 186592 DEBUG nova.compute.manager [req-bb166000-b64a-4f19-90b9-886360c8a419 req-9e79e13f-8f40-4f14-a4cb-0624d916fe45 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:53:38 compute-0 nova_compute[186588]: 2026-02-26 20:53:38.277 186592 DEBUG oslo_concurrency.lockutils [req-bb166000-b64a-4f19-90b9-886360c8a419 req-9e79e13f-8f40-4f14-a4cb-0624d916fe45 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:38 compute-0 nova_compute[186588]: 2026-02-26 20:53:38.278 186592 DEBUG oslo_concurrency.lockutils [req-bb166000-b64a-4f19-90b9-886360c8a419 req-9e79e13f-8f40-4f14-a4cb-0624d916fe45 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:38 compute-0 nova_compute[186588]: 2026-02-26 20:53:38.278 186592 DEBUG oslo_concurrency.lockutils [req-bb166000-b64a-4f19-90b9-886360c8a419 req-9e79e13f-8f40-4f14-a4cb-0624d916fe45 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:38 compute-0 nova_compute[186588]: 2026-02-26 20:53:38.278 186592 DEBUG nova.compute.manager [req-bb166000-b64a-4f19-90b9-886360c8a419 req-9e79e13f-8f40-4f14-a4cb-0624d916fe45 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] No waiting events found dispatching network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:53:38 compute-0 nova_compute[186588]: 2026-02-26 20:53:38.278 186592 WARNING nova.compute.manager [req-bb166000-b64a-4f19-90b9-886360c8a419 req-9e79e13f-8f40-4f14-a4cb-0624d916fe45 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received unexpected event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 for instance with vm_state active and task_state None.
Feb 26 20:53:39 compute-0 nova_compute[186588]: 2026-02-26 20:53:39.834 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:39 compute-0 NetworkManager[56360]: <info>  [1772139219.8822] manager: (patch-br-int-to-provnet-f52058ba-9be8-4a41-969a-2d602f39045e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Feb 26 20:53:39 compute-0 NetworkManager[56360]: <info>  [1772139219.8833] manager: (patch-provnet-f52058ba-9be8-4a41-969a-2d602f39045e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Feb 26 20:53:39 compute-0 nova_compute[186588]: 2026-02-26 20:53:39.884 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:39 compute-0 nova_compute[186588]: 2026-02-26 20:53:39.897 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:39 compute-0 ovn_controller[96598]: 2026-02-26T20:53:39Z|00121|binding|INFO|Releasing lport 7dd2cb57-eac7-44af-bf40-6194adad2978 from this chassis (sb_readonly=0)
Feb 26 20:53:39 compute-0 nova_compute[186588]: 2026-02-26 20:53:39.910 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:40 compute-0 nova_compute[186588]: 2026-02-26 20:53:40.201 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:40 compute-0 nova_compute[186588]: 2026-02-26 20:53:40.353 186592 DEBUG nova.compute.manager [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-changed-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:53:40 compute-0 nova_compute[186588]: 2026-02-26 20:53:40.354 186592 DEBUG nova.compute.manager [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Refreshing instance network info cache due to event network-changed-7b6ced3e-d715-40d1-a692-bd78c7826343. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 26 20:53:40 compute-0 nova_compute[186588]: 2026-02-26 20:53:40.354 186592 DEBUG oslo_concurrency.lockutils [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 26 20:53:40 compute-0 nova_compute[186588]: 2026-02-26 20:53:40.354 186592 DEBUG oslo_concurrency.lockutils [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquired lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 26 20:53:40 compute-0 nova_compute[186588]: 2026-02-26 20:53:40.354 186592 DEBUG nova.network.neutron [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Refreshing network info cache for port 7b6ced3e-d715-40d1-a692-bd78c7826343 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 26 20:53:41 compute-0 nova_compute[186588]: 2026-02-26 20:53:41.864 186592 DEBUG nova.network.neutron [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Updated VIF entry in instance network info cache for port 7b6ced3e-d715-40d1-a692-bd78c7826343. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 26 20:53:41 compute-0 nova_compute[186588]: 2026-02-26 20:53:41.865 186592 DEBUG nova.network.neutron [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Updating instance_info_cache with network_info: [{"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:53:41 compute-0 nova_compute[186588]: 2026-02-26 20:53:41.890 186592 DEBUG oslo_concurrency.lockutils [req-6b4e70a2-68a3-496e-95fb-e1c0d591293a req-162aabb7-2740-476e-9011-633397b6e7a2 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Releasing lock "refresh_cache-41218741-4043-4065-b907-a515e68ab4d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 26 20:53:44 compute-0 nova_compute[186588]: 2026-02-26 20:53:44.869 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:45 compute-0 nova_compute[186588]: 2026-02-26 20:53:45.205 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:46.523 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:53:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:46.524 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:53:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:46.524 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:53:48 compute-0 ovn_controller[96598]: 2026-02-26T20:53:48Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:1a:72 10.100.0.11
Feb 26 20:53:48 compute-0 ovn_controller[96598]: 2026-02-26T20:53:48Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:1a:72 10.100.0.11
Feb 26 20:53:49 compute-0 podman[220504]: 2026-02-26 20:53:49.535490875 +0000 UTC m=+0.048405731 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 26 20:53:49 compute-0 podman[220503]: 2026-02-26 20:53:49.541446392 +0000 UTC m=+0.053573438 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 26 20:53:49 compute-0 podman[220506]: 2026-02-26 20:53:49.556930821 +0000 UTC m=+0.063257963 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, tcib_managed=true)
Feb 26 20:53:49 compute-0 nova_compute[186588]: 2026-02-26 20:53:49.872 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:50 compute-0 nova_compute[186588]: 2026-02-26 20:53:50.206 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:54 compute-0 nova_compute[186588]: 2026-02-26 20:53:54.872 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:55 compute-0 nova_compute[186588]: 2026-02-26 20:53:55.208 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:53:57 compute-0 podman[220567]: 2026-02-26 20:53:57.590527455 +0000 UTC m=+0.101022214 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:57.894 106321 DEBUG eventlet.wsgi.server [-] (106321) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:57.898 106321 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: Accept: */*
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: Connection: close
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: Content-Type: text/plain
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: Host: 169.254.169.254
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: User-Agent: curl/7.84.0
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: X-Forwarded-For: 10.100.0.11
Feb 26 20:53:57 compute-0 ovn_metadata_agent[105924]: X-Ovn-Network-Id: 1e235ab3-53aa-4e90-83dc-62b618e5dc61 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:59.291 106321 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:59.292 106321 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.3942792
Feb 26 20:53:59 compute-0 haproxy-metadata-proxy-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220471]: 10.100.0.11:42156 [26/Feb/2026:20:53:57.890] listener listener/metadata 0/0/0/1401/1401 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:59.405 106321 DEBUG eventlet.wsgi.server [-] (106321) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:59.406 106321 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: Accept: */*
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: Connection: close
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: Content-Length: 100
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: Content-Type: application/x-www-form-urlencoded
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: Host: 169.254.169.254
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: User-Agent: curl/7.84.0
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: X-Forwarded-For: 10.100.0.11
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: X-Ovn-Network-Id: 1e235ab3-53aa-4e90-83dc-62b618e5dc61
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:59.649 106321 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 26 20:53:59 compute-0 haproxy-metadata-proxy-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220471]: 10.100.0.11:42166 [26/Feb/2026:20:53:59.404] listener listener/metadata 0/0/0/245/245 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Feb 26 20:53:59 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:53:59.650 106321 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.2443442
Feb 26 20:53:59 compute-0 podman[202527]: time="2026-02-26T20:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:53:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23221 "" "Go-http-client/1.1"
Feb 26 20:53:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3510 "" "Go-http-client/1.1"
Feb 26 20:53:59 compute-0 nova_compute[186588]: 2026-02-26 20:53:59.873 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:00 compute-0 nova_compute[186588]: 2026-02-26 20:54:00.211 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:01 compute-0 openstack_network_exporter[205682]: ERROR   20:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:54:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:54:01 compute-0 openstack_network_exporter[205682]: ERROR   20:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:54:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.719 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.720 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.720 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.721 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.721 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.722 186592 INFO nova.compute.manager [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Terminating instance
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.724 186592 DEBUG nova.compute.manager [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 26 20:54:01 compute-0 kernel: tap7b6ced3e-d7 (unregistering): left promiscuous mode
Feb 26 20:54:01 compute-0 NetworkManager[56360]: <info>  [1772139241.7570] device (tap7b6ced3e-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.765 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:01 compute-0 ovn_controller[96598]: 2026-02-26T20:54:01Z|00122|binding|INFO|Releasing lport 7b6ced3e-d715-40d1-a692-bd78c7826343 from this chassis (sb_readonly=0)
Feb 26 20:54:01 compute-0 ovn_controller[96598]: 2026-02-26T20:54:01Z|00123|binding|INFO|Setting lport 7b6ced3e-d715-40d1-a692-bd78c7826343 down in Southbound
Feb 26 20:54:01 compute-0 ovn_controller[96598]: 2026-02-26T20:54:01Z|00124|binding|INFO|Removing iface tap7b6ced3e-d7 ovn-installed in OVS
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.769 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:01.776 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:1a:72 10.100.0.11'], port_security=['fa:16:3e:8e:1a:72 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '41218741-4043-4065-b907-a515e68ab4d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbe792f8647846a2a575b93f32132584', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0dd430dc-9b7c-4774-8837-214ec3049e46 ad7c47b1-3c1b-4461-97d1-3ebfc2a23676', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.232'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb5008f7-b871-4a86-8c10-0d9474cf9977, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>], logical_port=7b6ced3e-d715-40d1-a692-bd78c7826343) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6af43f2d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.779 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:01.778 105929 INFO neutron.agent.ovn.metadata.agent [-] Port 7b6ced3e-d715-40d1-a692-bd78c7826343 in datapath 1e235ab3-53aa-4e90-83dc-62b618e5dc61 unbound from our chassis
Feb 26 20:54:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:01.780 105929 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1e235ab3-53aa-4e90-83dc-62b618e5dc61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 26 20:54:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:01.783 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[9341d5e9-3fd3-48af-8c52-076f17618973]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:01 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:01.784 105929 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61 namespace which is not needed anymore
Feb 26 20:54:01 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 26 20:54:01 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000008.scope: Consumed 12.283s CPU time.
Feb 26 20:54:01 compute-0 systemd-machined[155924]: Machine qemu-9-instance-00000008 terminated.
Feb 26 20:54:01 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [NOTICE]   (220465) : haproxy version is 2.8.14-c23fe91
Feb 26 20:54:01 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [NOTICE]   (220465) : path to executable is /usr/sbin/haproxy
Feb 26 20:54:01 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [WARNING]  (220465) : Exiting Master process...
Feb 26 20:54:01 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [WARNING]  (220465) : Exiting Master process...
Feb 26 20:54:01 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [ALERT]    (220465) : Current worker (220471) exited with code 143 (Terminated)
Feb 26 20:54:01 compute-0 neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61[220450]: [WARNING]  (220465) : All workers exited. Exiting... (0)
Feb 26 20:54:01 compute-0 systemd[1]: libpod-2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e.scope: Deactivated successfully.
Feb 26 20:54:01 compute-0 podman[220620]: 2026-02-26 20:54:01.937556494 +0000 UTC m=+0.058578401 container died 2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 26 20:54:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e-userdata-shm.mount: Deactivated successfully.
Feb 26 20:54:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-2fb8e77de8c14331450c48e7225cc5a8b8f2f87df1a92df050add54ad644a21e-merged.mount: Deactivated successfully.
Feb 26 20:54:01 compute-0 podman[220620]: 2026-02-26 20:54:01.974873831 +0000 UTC m=+0.095895758 container cleanup 2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.980 186592 INFO nova.virt.libvirt.driver [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Instance destroyed successfully.
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.981 186592 DEBUG nova.objects.instance [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lazy-loading 'resources' on Instance uuid 41218741-4043-4065-b907-a515e68ab4d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.993 186592 DEBUG nova.virt.libvirt.vif [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-26T20:53:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-124458125',display_name='tempest-TestServerBasicOps-server-124458125',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-124458125',id=8,image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLxM9F6OsK/EFiDFp9m+RC2x6AOO0lee0+kc3m3dBlaDY3a+ixFGFHILJ/HIjNGvRWfhO+zQNpGpBIXMMM1CkncxYK3/Bfpx2CcZvGaHIO28J/rfipWfTnpTLyuR79hMSg==',key_name='tempest-TestServerBasicOps-1703881118',keypairs=<?>,launch_index=0,launched_at=2026-02-26T20:53:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dbe792f8647846a2a575b93f32132584',ramdisk_id='',reservation_id='r-c2kqz309',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b79c8674-3f8a-4529-8bd8-8464687ab831',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-477858634',owner_user_name='tempest-TestServerBasicOps-477858634-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-26T20:53:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='713f1a36b6f94f8385695bf5d7923c04',uuid=41218741-4043-4065-b907-a515e68ab4d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": 
"fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.994 186592 DEBUG nova.network.os_vif_util [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Converting VIF {"id": "7b6ced3e-d715-40d1-a692-bd78c7826343", "address": "fa:16:3e:8e:1a:72", "network": {"id": "1e235ab3-53aa-4e90-83dc-62b618e5dc61", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1823645827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dbe792f8647846a2a575b93f32132584", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b6ced3e-d7", "ovs_interfaceid": "7b6ced3e-d715-40d1-a692-bd78c7826343", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.995 186592 DEBUG nova.network.os_vif_util [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.995 186592 DEBUG os_vif [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 26 20:54:01 compute-0 systemd[1]: libpod-conmon-2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e.scope: Deactivated successfully.
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.997 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:01 compute-0 nova_compute[186588]: 2026-02-26 20:54:01.998 186592 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b6ced3e-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.000 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.002 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.002 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.005 186592 INFO os_vif [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:1a:72,bridge_name='br-int',has_traffic_filtering=True,id=7b6ced3e-d715-40d1-a692-bd78c7826343,network=Network(1e235ab3-53aa-4e90-83dc-62b618e5dc61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b6ced3e-d7')
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.005 186592 INFO nova.virt.libvirt.driver [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Deleting instance files /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3_del
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.006 186592 INFO nova.virt.libvirt.driver [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Deletion of /var/lib/nova/instances/41218741-4043-4065-b907-a515e68ab4d3_del complete
Feb 26 20:54:02 compute-0 podman[220664]: 2026-02-26 20:54:02.039645765 +0000 UTC m=+0.042431324 container remove 2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.043 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[7764834d-df7f-4c71-9b5c-9f16bd251e37]: (4, ('Thu Feb 26 08:54:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61 (2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e)\n2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e\nThu Feb 26 08:54:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61 (2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e)\n2dda8f1b1e3b857d490badfee7a1aae3c4444e566f224ee7aff4f45fe1fd890e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.046 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[10830a3a-fe86-421d-9f69-a45958f6d04e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.047 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e235ab3-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.049 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:02 compute-0 kernel: tap1e235ab3-50: left promiscuous mode
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.055 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.062 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[041ded85-b250-40ab-a64c-c2293a14827a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.065 186592 INFO nova.compute.manager [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.066 186592 DEBUG oslo.service.loopingcall [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.066 186592 DEBUG nova.compute.manager [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.066 186592 DEBUG nova.network.neutron [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.075 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[c78f6809-d847-4668-b5f2-6fed6daf27b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.076 186592 DEBUG nova.compute.manager [req-3ed45d85-0909-43dc-96ed-df9b76e5850b req-ff0397bc-2e5b-4bfe-992e-f00fad98060a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-vif-unplugged-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.076 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[229f18a6-79f9-4a26-acc3-429d33093a92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.077 186592 DEBUG oslo_concurrency.lockutils [req-3ed45d85-0909-43dc-96ed-df9b76e5850b req-ff0397bc-2e5b-4bfe-992e-f00fad98060a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.077 186592 DEBUG oslo_concurrency.lockutils [req-3ed45d85-0909-43dc-96ed-df9b76e5850b req-ff0397bc-2e5b-4bfe-992e-f00fad98060a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.077 186592 DEBUG oslo_concurrency.lockutils [req-3ed45d85-0909-43dc-96ed-df9b76e5850b req-ff0397bc-2e5b-4bfe-992e-f00fad98060a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.077 186592 DEBUG nova.compute.manager [req-3ed45d85-0909-43dc-96ed-df9b76e5850b req-ff0397bc-2e5b-4bfe-992e-f00fad98060a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] No waiting events found dispatching network-vif-unplugged-7b6ced3e-d715-40d1-a692-bd78c7826343 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:54:02 compute-0 nova_compute[186588]: 2026-02-26 20:54:02.078 186592 DEBUG nova.compute.manager [req-3ed45d85-0909-43dc-96ed-df9b76e5850b req-ff0397bc-2e5b-4bfe-992e-f00fad98060a d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-vif-unplugged-7b6ced3e-d715-40d1-a692-bd78c7826343 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.088 217873 DEBUG oslo.privsep.daemon [-] privsep: reply[5faed6d9-d900-4a1c-9704-d5bc011598d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380258, 'reachable_time': 34135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220681, 'error': None, 'target': 'ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d1e235ab3\x2d53aa\x2d4e90\x2d83dc\x2d62b618e5dc61.mount: Deactivated successfully.
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.095 106452 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1e235ab3-53aa-4e90-83dc-62b618e5dc61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 26 20:54:02 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:02.096 106452 DEBUG oslo.privsep.daemon [-] privsep: reply[31b95444-0d07-4b93-bb8d-4b0856651baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.017 186592 DEBUG nova.network.neutron [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.043 186592 INFO nova.compute.manager [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Took 0.98 seconds to deallocate network for instance.
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.107 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.108 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.163 186592 DEBUG nova.compute.provider_tree [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.178 186592 DEBUG nova.scheduler.client.report [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.199 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.229 186592 INFO nova.scheduler.client.report [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Deleted allocations for instance 41218741-4043-4065-b907-a515e68ab4d3
Feb 26 20:54:03 compute-0 nova_compute[186588]: 2026-02-26 20:54:03.474 186592 DEBUG oslo_concurrency.lockutils [None req-fa695f92-0652-4bca-b53d-f125fa6bb329 713f1a36b6f94f8385695bf5d7923c04 dbe792f8647846a2a575b93f32132584 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.234 186592 DEBUG nova.compute.manager [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.234 186592 DEBUG oslo_concurrency.lockutils [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Acquiring lock "41218741-4043-4065-b907-a515e68ab4d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.235 186592 DEBUG oslo_concurrency.lockutils [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.235 186592 DEBUG oslo_concurrency.lockutils [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] Lock "41218741-4043-4065-b907-a515e68ab4d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.236 186592 DEBUG nova.compute.manager [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] No waiting events found dispatching network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.236 186592 WARNING nova.compute.manager [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received unexpected event network-vif-plugged-7b6ced3e-d715-40d1-a692-bd78c7826343 for instance with vm_state deleted and task_state None.
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.237 186592 DEBUG nova.compute.manager [req-0bbd63c5-4e0a-476e-ba43-35738e7f7408 req-c9005553-dcb3-4576-bdbc-92512643d3c1 d0c264f4931b45d1a79d09e2bc41daec 984613d611454217acc4ea5d90677950 - - default default] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Received event network-vif-deleted-7b6ced3e-d715-40d1-a692-bd78c7826343 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 26 20:54:04 compute-0 podman[220682]: 2026-02-26 20:54:04.573441891 +0000 UTC m=+0.073171526 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 26 20:54:04 compute-0 nova_compute[186588]: 2026-02-26 20:54:04.875 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:07 compute-0 nova_compute[186588]: 2026-02-26 20:54:07.000 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:07 compute-0 podman[220706]: 2026-02-26 20:54:07.55182989 +0000 UTC m=+0.060765029 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 26 20:54:08 compute-0 nova_compute[186588]: 2026-02-26 20:54:08.054 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:08 compute-0 nova_compute[186588]: 2026-02-26 20:54:08.735 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:08 compute-0 nova_compute[186588]: 2026-02-26 20:54:08.774 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:09 compute-0 nova_compute[186588]: 2026-02-26 20:54:09.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:09 compute-0 nova_compute[186588]: 2026-02-26 20:54:09.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:54:09 compute-0 nova_compute[186588]: 2026-02-26 20:54:09.060 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:54:09 compute-0 nova_compute[186588]: 2026-02-26 20:54:09.134 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:54:09 compute-0 nova_compute[186588]: 2026-02-26 20:54:09.877 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:10 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:10.565 105929 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:c2:31', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:84:98:ae:7a:1c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 26 20:54:10 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:10.566 105929 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 26 20:54:10 compute-0 nova_compute[186588]: 2026-02-26 20:54:10.611 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:11 compute-0 nova_compute[186588]: 2026-02-26 20:54:11.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:12 compute-0 nova_compute[186588]: 2026-02-26 20:54:12.035 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:13 compute-0 nova_compute[186588]: 2026-02-26 20:54:13.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:13 compute-0 nova_compute[186588]: 2026-02-26 20:54:13.062 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:14 compute-0 nova_compute[186588]: 2026-02-26 20:54:14.062 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:14 compute-0 nova_compute[186588]: 2026-02-26 20:54:14.879 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:15 compute-0 nova_compute[186588]: 2026-02-26 20:54:15.054 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:16 compute-0 nova_compute[186588]: 2026-02-26 20:54:16.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:16 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:16.569 105929 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62bfa765-f40e-4724-bf05-2e8b811f0867, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 26 20:54:16 compute-0 nova_compute[186588]: 2026-02-26 20:54:16.979 186592 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772139241.9781063, 41218741-4043-4065-b907-a515e68ab4d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 26 20:54:16 compute-0 nova_compute[186588]: 2026-02-26 20:54:16.980 186592 INFO nova.compute.manager [-] [instance: 41218741-4043-4065-b907-a515e68ab4d3] VM Stopped (Lifecycle Event)
Feb 26 20:54:17 compute-0 nova_compute[186588]: 2026-02-26 20:54:17.077 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:17 compute-0 nova_compute[186588]: 2026-02-26 20:54:17.142 186592 DEBUG nova.compute.manager [None req-7e4bc203-e2d8-48d4-b7c6-659ba43f821d - - - - - -] [instance: 41218741-4043-4065-b907-a515e68ab4d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.071 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.118 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.119 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.120 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.120 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.285 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.287 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5660MB free_disk=72.7398567199707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.287 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.287 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.414 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.415 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.453 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.475 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.502 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.502 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:19 compute-0 nova_compute[186588]: 2026-02-26 20:54:19.921 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:20 compute-0 nova_compute[186588]: 2026-02-26 20:54:20.490 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:54:20 compute-0 nova_compute[186588]: 2026-02-26 20:54:20.491 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 26 20:54:20 compute-0 podman[220731]: 2026-02-26 20:54:20.546686303 +0000 UTC m=+0.055558731 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 26 20:54:20 compute-0 podman[220730]: 2026-02-26 20:54:20.566246731 +0000 UTC m=+0.082476744 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 26 20:54:20 compute-0 podman[220732]: 2026-02-26 20:54:20.571994753 +0000 UTC m=+0.083302256 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 26 20:54:22 compute-0 nova_compute[186588]: 2026-02-26 20:54:22.148 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:24 compute-0 nova_compute[186588]: 2026-02-26 20:54:24.923 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:27 compute-0 nova_compute[186588]: 2026-02-26 20:54:27.170 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:28 compute-0 podman[220790]: 2026-02-26 20:54:28.57506062 +0000 UTC m=+0.091808581 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 26 20:54:29 compute-0 podman[202527]: time="2026-02-26T20:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:54:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:54:29 compute-0 podman[202527]: @ - - [26/Feb/2026:20:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3037 "" "Go-http-client/1.1"
Feb 26 20:54:29 compute-0 nova_compute[186588]: 2026-02-26 20:54:29.989 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:31 compute-0 openstack_network_exporter[205682]: ERROR   20:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:54:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:54:31 compute-0 openstack_network_exporter[205682]: ERROR   20:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:54:31 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:54:32 compute-0 nova_compute[186588]: 2026-02-26 20:54:32.173 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:35 compute-0 nova_compute[186588]: 2026-02-26 20:54:35.019 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:35 compute-0 podman[220816]: 2026-02-26 20:54:35.535940662 +0000 UTC m=+0.046643644 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 26 20:54:37 compute-0 nova_compute[186588]: 2026-02-26 20:54:37.175 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:38 compute-0 podman[220840]: 2026-02-26 20:54:38.55548831 +0000 UTC m=+0.062836113 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 26 20:54:40 compute-0 nova_compute[186588]: 2026-02-26 20:54:40.022 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:42 compute-0 nova_compute[186588]: 2026-02-26 20:54:42.185 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:43 compute-0 ovn_controller[96598]: 2026-02-26T20:54:43Z|00125|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Feb 26 20:54:45 compute-0 nova_compute[186588]: 2026-02-26 20:54:45.025 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:46.524 105929 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:54:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:46.524 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:54:46 compute-0 ovn_metadata_agent[105924]: 2026-02-26 20:54:46.524 105929 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:54:47 compute-0 nova_compute[186588]: 2026-02-26 20:54:47.187 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:50 compute-0 nova_compute[186588]: 2026-02-26 20:54:50.026 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.072 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.073 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c0b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.073 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f1349f8cb60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.073 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e1b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c9e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ca10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ea20>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e270>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8caa0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8d2e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8cb90>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134b7a5460>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.074 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e4b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8ecc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c500>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8e510>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c590>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c5f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8fe00>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.075 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f1349f8cc20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c650>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c6b0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f1349f8e540>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8deb0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.076 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c710>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f1349f8cbf0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c770>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f134bf0afc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f1349f8e9f0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f1349f8c7d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f134a022990>] with cache [{}], pollster history [{'network.outgoing.packets': [], 'network.incoming.bytes.rate': [], 'network.incoming.bytes.delta': [], 'network.outgoing.bytes': [], 'network.outgoing.packets.error': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.077 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f1349f8c800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f1349f8ca70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f1349f8de20>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f134bf33770>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.078 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f1349f8fe60>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f1349f8c8c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f1349f8e480>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f1349f8ec90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f1349f8d340>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.079 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f1349f8e4e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f1349f8c440>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f1349f8c560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f1349f8c5c0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f1349f8cb00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f1349f8c620>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f1349f8c680>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f1349f8c980>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.080 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f1349f8c6e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f1349f8c740>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f1349f8c950>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f1349f8c7a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f1349f443b0>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.081 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.082 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.083 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 ceilometer_agent_compute[196312]: 2026-02-26 20:54:51.084 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Feb 26 20:54:51 compute-0 podman[220865]: 2026-02-26 20:54:51.532108981 +0000 UTC m=+0.046570124 container health_status 2451c6d27c1f5f8e63384db8c6cc976c6d478ec95bd83a2b8ac28b8188d99157 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 26 20:54:51 compute-0 podman[220867]: 2026-02-26 20:54:51.543574864 +0000 UTC m=+0.049635664 container health_status e68b30e8cd6ded22a848056b7aa1e6c175d5345171e92b1346f8e7e0fc9234bd (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=336c33a75b0986292736e60cb7ab23d4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 26 20:54:51 compute-0 podman[220866]: 2026-02-26 20:54:51.565630237 +0000 UTC m=+0.073204098 container health_status a8c1fd6a87ea1b956334c88f1d753ee98648333941c8e0e1273e7c0f3e7965b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 26 20:54:52 compute-0 nova_compute[186588]: 2026-02-26 20:54:52.190 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:55 compute-0 nova_compute[186588]: 2026-02-26 20:54:55.031 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:57 compute-0 nova_compute[186588]: 2026-02-26 20:54:57.195 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:54:59 compute-0 podman[220925]: 2026-02-26 20:54:59.573829029 +0000 UTC m=+0.082251086 container health_status c334df41e2f82a4a7a7cec611e42f130e379379203ea9fa522284ee64bfb6e25 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'eb4430eb1f0b11807132c0e900bb001d770b3239c5f73f8caf0538c69b550e99-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 26 20:54:59 compute-0 podman[202527]: time="2026-02-26T20:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 26 20:54:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 21988 "" "Go-http-client/1.1"
Feb 26 20:54:59 compute-0 podman[202527]: @ - - [26/Feb/2026:20:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3043 "" "Go-http-client/1.1"
Feb 26 20:55:00 compute-0 nova_compute[186588]: 2026-02-26 20:55:00.085 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:01 compute-0 openstack_network_exporter[205682]: ERROR   20:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 26 20:55:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:55:01 compute-0 openstack_network_exporter[205682]: ERROR   20:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 26 20:55:01 compute-0 openstack_network_exporter[205682]: 
Feb 26 20:55:02 compute-0 nova_compute[186588]: 2026-02-26 20:55:02.199 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:05 compute-0 nova_compute[186588]: 2026-02-26 20:55:05.089 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:06 compute-0 podman[220952]: 2026-02-26 20:55:06.538537664 +0000 UTC m=+0.047951649 container health_status 8fd0e820317307df52af3019e4057632d82f5f2deeaf4f1ac65d7670f61f0d33 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 26 20:55:07 compute-0 nova_compute[186588]: 2026-02-26 20:55:07.202 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:07 compute-0 sshd-session[220976]: Accepted publickey for zuul from 192.168.122.10 port 43962 ssh2: ECDSA SHA256:y5IytN6WUHnPcgmx9s32+gtJBlPqK+SbSV4XY5V2Bd0
Feb 26 20:55:07 compute-0 systemd-logind[825]: New session 26 of user zuul.
Feb 26 20:55:07 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 26 20:55:07 compute-0 sshd-session[220976]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 26 20:55:07 compute-0 sudo[220980]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 26 20:55:07 compute-0 sudo[220980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 26 20:55:08 compute-0 podman[221014]: 2026-02-26 20:55:08.671749223 +0000 UTC m=+0.073236098 container health_status ec7e6c0ca6a695343497e85c875512902a723165bb7ee885a87187586f8754e0 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '2009eab931001e99deabeae489866f32d530ab4fa97692f8c8c199ef3d72436d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, name=ubi9/ubi-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 26 20:55:10 compute-0 nova_compute[186588]: 2026-02-26 20:55:10.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:10 compute-0 nova_compute[186588]: 2026-02-26 20:55:10.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 26 20:55:10 compute-0 nova_compute[186588]: 2026-02-26 20:55:10.061 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 26 20:55:10 compute-0 nova_compute[186588]: 2026-02-26 20:55:10.083 186592 DEBUG nova.compute.manager [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 26 20:55:10 compute-0 nova_compute[186588]: 2026-02-26 20:55:10.091 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:12 compute-0 ovs-vsctl[221167]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 26 20:55:12 compute-0 nova_compute[186588]: 2026-02-26 20:55:12.204 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:12 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 221004 (sos)
Feb 26 20:55:12 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 26 20:55:12 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 26 20:55:12 compute-0 virtqemud[185803]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 26 20:55:12 compute-0 virtqemud[185803]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 26 20:55:12 compute-0 virtqemud[185803]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 26 20:55:13 compute-0 nova_compute[186588]: 2026-02-26 20:55:13.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:13 compute-0 crontab[221570]: (root) LIST (root)
Feb 26 20:55:14 compute-0 nova_compute[186588]: 2026-02-26 20:55:14.061 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:15 compute-0 nova_compute[186588]: 2026-02-26 20:55:15.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:15 compute-0 nova_compute[186588]: 2026-02-26 20:55:15.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:15 compute-0 nova_compute[186588]: 2026-02-26 20:55:15.094 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:15 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 26 20:55:15 compute-0 systemd[1]: Starting Hostname Service...
Feb 26 20:55:15 compute-0 systemd[1]: Started Hostname Service.
Feb 26 20:55:17 compute-0 nova_compute[186588]: 2026-02-26 20:55:17.054 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:17 compute-0 nova_compute[186588]: 2026-02-26 20:55:17.207 186592 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 26 20:55:18 compute-0 nova_compute[186588]: 2026-02-26 20:55:18.059 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.060 186592 DEBUG oslo_service.periodic_task [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.225 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.225 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.225 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.225 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.344 186592 WARNING nova.virt.libvirt.driver [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.346 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5407MB free_disk=72.48788452148438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.346 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.346 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.602 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.603 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.633 186592 DEBUG nova.compute.provider_tree [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed in ProviderTree for provider: 895ba9a7-707f-4e79-9130-ec9b9afa47ee update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.660 186592 DEBUG nova.scheduler.client.report [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Inventory has not changed for provider 895ba9a7-707f-4e79-9130-ec9b9afa47ee based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.661 186592 DEBUG nova.compute.resource_tracker [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 26 20:55:19 compute-0 nova_compute[186588]: 2026-02-26 20:55:19.661 186592 DEBUG oslo_concurrency.lockutils [None req-dd5150d9-d3b4-45c6-b1b7-af9871a5cd99 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
